//An important corollary of this principle is that you need to determine at every step whether you are dealing with garbage or not//. Two habits that help with this are visualization (explored in [[analysis:nsb2014:week5|Module 5]]) and unit testing (put simply, the practice of testing specific pieces of functionality or "units"; employed throughout the modules).
  
To see why this principle is critical, consider a complex multistep experimental procedure such as surgically implanting a recording probe into the brain. In this setting, the surgeon //always// verifies the success of the previous step before proceeding. One would never attempt to insert a probe without making sure the dura is removed first, or apply dental cement to the skull without first making sure it is dry. Apply the same mindset to analysis and confirm the success of every step before proceeding!
  
=== 2. Plan ahead (from raw data to result) ===
  
Before the start of data collection, you should identify the steps in your data processing "pipeline" -- that is, the flow from raw data to the figures in the resulting publication. Doing this can often highlight key dependencies and potentially important controls that help you collect the data such that you can actually test what you set out to do.

This sort of planning is especially important when performing experiments with long timelines that are not easily changed, such as when chronically implanting animals for in vivo recording, where it may take up to two months to collect data from a single animal. For smaller projects or those with faster iteration times (e.g. a new slice every day) you can be more flexible.
  
There are two steps to this planning process:
  
**First**, think in terms of data, and transformations on those data, to create a schematic that illustrates your analysis workflow at a conceptual level.
  
For instance, to determine whether the number of sharp wave-ripple complexes (SWRs; these are candidate "replay" events in the hippocampus) that occur depends on an experimental manipulation, a possible analysis workflow might be represented as follows (generated with the [[https://www.dokuwiki.org/plugin:graphviz|DokuWiki plugin]] for [[http://www.graphviz.org/|GraphViz]]):
  
<graphviz>
</graphviz>
  
The above workflow shows how raw local field potential (LFP) data is first loaded (by the ''LoadCSC()'' function) and then filtered (''FilterLFP()''). Note that at this stage, you can simply make up function names, as long as they are descriptive (see Principle 3, below). Next, SWR events are detected from the filtered LFP, and the number of events in each trial is counted before applying a statistical test.
  
The square brackets such as %%[TSD]%% refer to standardized data types, introduced in [[analysis:nsb2014:week2|Module 2]]. Briefly, a TSD object describes one or more time-varying signals (such as LFP or videotracker data), an IV object describes interval data (such as SWR events, which have a start and end time as well as some properties such as their power), and a TS object describes timestamps (such as spike times). By standardizing the form in which these data types are handled, we can more easily implement unit tests and write clean, modular code.
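To make this concrete, below is a minimal sketch of what such containers could look like as plain MATLAB structs. The field names are illustrative assumptions only; the actual data type definitions are introduced in [[analysis:nsb2014:week2|Module 2]].

<code matlab>
% Illustrative sketch only -- field names are assumptions, not the
% actual data type definitions used in the course code.

% TSD: one or more time-varying signals (e.g. an LFP trace)
lfp_tsd.tvec  = 0:0.001:10;                 % timestamps in seconds (1 kHz)
lfp_tsd.data  = randn(size(lfp_tsd.tvec));  % signal samples (here, white noise)
lfp_tsd.label = {'CSC01'};                  % channel label (placeholder)

% IV: interval data (e.g. detected SWR events)
swr_iv.tstart = [2.10; 5.45];               % event start times (s)
swr_iv.tend   = [2.18; 5.52];               % event end times (s)

% TS: timestamps (e.g. spike times of one cell)
spikes_ts.t = {sort(rand(100,1)*10)};       % 100 spike times in [0 10] s
</code>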
  
**Second**: based on a data analysis workflow such as the above, write out some example pseudocode that would implement the analysis in MATLAB. For the workflow above, this might look something like:
  
<code matlab>
</code>
  
Note that each analysis step is implemented by a function, with a ''cfg'' struct to specify some parameters of the transformation (e.g. the frequency band to filter). The overall workflow is accomplished by calling the appropriate functions on evolving data types. Perhaps some of the functions you need already exist, or you may need to write some of them. Either way, making the analysis steps explicit in this way provides a good starting point for writing well-organized code.
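The full listing is not reproduced in this comparison view; as a rough illustration, a pseudocode sketch of such a workflow could look like the following. Only ''LoadCSC()'' and ''FilterLFP()'' are named in the text above; the remaining function names, ''cfg'' fields, and variables are hypothetical placeholders.

<code matlab>
% Pseudocode sketch of the SWR-counting workflow described above.
% Function names other than LoadCSC() and FilterLFP(), as well as all
% cfg fields and variable names, are hypothetical placeholders.

cfg = []; cfg.fc = {'my_lfp_file.ncs'};     % which file to load (placeholder filename)
lfp = LoadCSC(cfg);                         % raw LFP as a [TSD]

cfg = []; cfg.f = [140 220];                % ripple-band frequencies to keep (Hz)
lfp_filt = FilterLFP(cfg, lfp);             % filtered LFP, still a [TSD]

cfg = []; cfg.threshold = 5;                % detection threshold (e.g. in z-scored power)
swr_iv = DetectSWR(cfg, lfp_filt);          % SWR events as an [IV]

cfg = [];
swr_count = CountByTrial(cfg, swr_iv, trial_iv);  % number of SWRs in each trial

% compare SWR counts between the two experimental conditions
p = ranksum(swr_count(condition == 1), swr_count(condition == 2));
</code>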
  
=== 3. Use good programming practice ===
  
There are [[http://stackoverflow.com/questions/550861/improving-code-readability | many]] resources and opinions on what constitutes good programming practice. A few of the most important are:
  
  * //Don't repeat yourself//. Implementing each piece of functionality only once means your code will be easier to troubleshoot, re-use, and extend -- as well as easier to read.
  * //Unit testing//. Provide test scenarios with key pieces of code where you know what the expected outcome is. For data analysis this commonly involves generating artificial data, such as white noise or Poisson spike trains with a certain average firing rate; a minimal example is sketched below. These tests are extremely helpful in interpreting your data later, and for checking that changes you make to the code have not broken its functionality.
  * //Readability//. Generally, whatever analysis you are doing, you will probably have to do it again. Maybe on the same data after you make a change to the code, maybe after you collect more data. Maybe tomorrow, maybe next year. It is tempting to assume you will remember what you did and why, but this will not always be the case! Plus, even if //you// do, it's likely someone else (such as your adviser, or a collaborator) will have to run and understand your code. Whether or not they can will reflect on you.
  * //Consistency//. Use consistent naming schemes for different kinds of variables and functions, and always place constants and parameters at the start of each file.
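For example, a minimal unit test along the lines of the second point might generate a Poisson spike train with a known firing rate and check that the analysis recovers it. The variable names and tolerance below are arbitrary choices for illustration:

<code matlab>
% Minimal unit test sketch: generate an artificial Poisson spike train
% with a known mean firing rate, then check that a simple rate estimate
% recovers the value we put in. Names and tolerance are arbitrary.

rate = 20;      % target mean firing rate (Hz)
T    = 100;     % duration of the artificial spike train (s)
dt   = 0.001;   % time step (s)

% in each small bin, a spike occurs with probability rate*dt
spk = rand(1, round(T/dt)) < rate*dt;
spike_times = find(spk)*dt;

rate_est = numel(spike_times)/T;   % estimated firing rate (Hz)

% the test passes if the estimate is close to the known input rate
assert(abs(rate_est - rate) < 1, 'Estimated firing rate differs from expected value');
</code>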
=== 4. Write to share ===
  