===== Time-frequency analysis II: second-order relationships, phase-amplitude relationships =====

Goals:

  * 1
  * 2

Deliverables:

  * 1

Resources:

  * 1
  * 2

==== Introduction ====

In the previous module we saw that a //spectrogram// can determine the frequency content of a signal over time. Often, spectrograms of brain signals reveal rich patterns of interacting frequency bands. A simple example of this is the previously mentioned panel C of the first figure in the previous module, which shows that in the rat ventral striatum, gamma-50 and gamma-80 power tend to be slightly anticorrelated. More generally, we might ask: what structure is there in the spectrogram? What relationships exist between different frequency bands? Can we predict power based on the history of the signal? Beyond the power (amplitude) components of the spectrogram, an additional dimension is //phase//, which may enter into systematic relationships with power. For instance, in the hippocampus, gamma power tends to occur at certain phases of the slower theta (~8Hz) rhythm (Colgin et al. 2009). Such "second-order" relationships (built on the "first-order" spectrogram) are the focus of this module.

==== Reminder: correlation ====

A simple yet powerful tool for characterizing dependencies between variables -- such as power in one frequency band and power in another -- is Pearson's product-moment correlation coefficient (or "correlation coefficient" for short), which for a finite sample can be computed as follows:

$$ r_{xy} = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_{i=1}^{n} (x_i - \bar{x})^2}\,\sqrt{\sum_{i=1}^{n} (y_i - \bar{y})^2}} $$

where x and y are the signals to be correlated, with means of x-bar and y-bar respectively. In other words, it is the covariance between x and y divided by the product of their standard deviations. It is 0 for uncorrelated signals (independent signals are always uncorrelated), -1 for perfectly anticorrelated signals, and 1 for perfectly correlated signals.
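
As a quick sanity check of this definition, here is a minimal sketch (using made-up signals, not course data) that computes //r// directly from the formula above and compares it to the covariance-over-standard-deviations form:

<code matlab>
x = randn(100,1); y = 0.5*x + randn(100,1); % two made-up, partially correlated signals
xd = x - mean(x); yd = y - mean(y); % remove the means

r_formula = sum(xd.*yd) / (sqrt(sum(xd.^2)) * sqrt(sum(yd.^2))); % formula above
r_covstd = (xd'*yd / (length(x)-1)) / (std(x)*std(y)); % covariance divided by product of standard deviations

% r_formula and r_covstd should be numerically identical
</code>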

Note that the correlation coefficient only captures //linear// relationships between variables, as can be seen from these scatterplots:

{{ http://upload.wikimedia.org/wikipedia/commons/thumb/d/d4/Correlation_examples2.svg/400px-Correlation_examples2.svg.png?400 |}}
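
To see this numerically, consider a quick sketch (made-up data; ''corrcoef()'' is introduced just below): a relationship that is perfectly deterministic but nonlinear can still give a correlation coefficient near zero.

<code matlab>
x = linspace(-1,1,100)'; % made-up independent variable, symmetric about zero
y = x.^2; % y is completely determined by x, but the relationship is not linear
r = corrcoef(x,y);
r(1,2) % close to zero, despite the perfect (nonlinear) dependence
</code>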

The MATLAB function ''corrcoef()'' computes the sample correlation coefficient, and in addition provides the probability of obtaining an //r// value at least as extreme as the one reported under the null hypothesis that the signals are uncorrelated (the "p-value"):

<code matlab>
x = randn(100,4); % uncorrelated data - 4 signals of 100 points each
x(:,4) = sum(x,2); % 4th signal is now correlated with first 3
[r,p] = corrcoef(x) % compute sample correlation and p-values; notice signal 4 is correlated with the others
imagesc(r) % plot the correlation matrix -- note symmetry, diagonal, and r values of ~0.5 for signal 4
</code>

''corrcoef()'' computes the correlation coefficients and p-values for every pair of signals in a matrix -- in this case, for 4 signals. We can see from its output that as expected, signal 4 is correlated with every other signal. Note also that, of course, the correlation of any signal with itself (the values along the diagonal) is 1.

Armed with this tool, we can compute the correlation matrix of a ventral striatal spectrogram:

<code matlab>
% remember to cd to data first
fname = 'R016-2012-10-03-CSC04a.Ncs';
csc = myLoadCSC(fname);

cscR = Restrict(csc,2700,3300); % risk session only
Fs = 2000;

[S,F,T,P] = spectrogram(Data(cscR),hanning(512),384,1:0.25:200,Fs); % spectrogram

[r,p] = corrcoef(10*log10(P')); % correlation matrix (across frequencies) of spectrogram

% plot
imagesc(F,F,r);
caxis([-0.1 0.5]); axis xy; colorbar; grid on;
set(gca,'XLim',[0 150],'YLim',[0 150],'FontSize',14,'XTick',0:10:150,'YTick',0:10:150);
</code>

You should get:

{{ :analysis:course:week7_fig1.png?600 |}}

As we would expect, nearby frequencies tend to be correlated. The blob around 80-100Hz indicates that these frequencies tend to co-occur. In contrast, there is a dark patch centered around (60,95), indicating that power at those frequencies is slightly anti-correlated. Note also the positive correlation around (20,60), suggesting that beta and low-gamma power co-occur. This pattern in the cross-frequency correlation matrix is characteristic of ventral striatal LFPs (van der Meer and Redish, 2009).
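
To read a specific value off this matrix, one can look up the frequency bins closest to a pair of frequencies of interest -- a small sketch continuing from the block above (the 60 and 95 Hz values simply echo the dark patch just mentioned):

<code matlab>
f1 = 60; f2 = 95; % frequencies of interest (Hz)
[~,idx1] = min(abs(F-f1)); % frequency bin closest to 60 Hz
[~,idx2] = min(abs(F-f2)); % frequency bin closest to 95 Hz
r(idx1,idx2) % correlation between power at ~60 Hz and ~95 Hz; slightly negative, per the dark patch
</code>
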
=== Autocorrelation for continuous signals ===

The above analysis focused on the //co-occurrence// of power across different frequencies; it asks, "when power at some frequency is high, what does that tell me about the power at other frequencies //at that same time//"?

However, often we are also interested in relationships that are not limited to co-occurrence -- for instance, it is possible that power at a certain frequency predicts that **some time later** a power increase in some other band will occur. To characterize such relationships, which involve a time lag, we first need the concept of //autocorrelation//.

== Autocorrelation on synthetic signals ==

The correlation of a signal with itself is 1 by definition. However, what about the correlation of a signal with the same signal, //one sample later//? For white noise, where each sample is generated independently, this should be zero:

<code matlab>
wnoise = randn(1000,1);
[r,p] = corrcoef(wnoise(1:end-1),wnoise(2:end))
</code>

Note that when passing two arguments to ''corrcoef()'' the output is a 2x2 matrix, where the diagonal elements are 1 and the off-diagonal elements are the ones of interest.
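
For instance, the lag-1 correlation and its p-value can be pulled out of an off-diagonal entry (a tiny sketch continuing from the block above; the variable names are just illustrative):

<code matlab>
r_lag1 = r(1,2) % correlation between the signal and itself shifted by one sample
p_lag1 = p(1,2) % corresponding p-value
</code>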

We can generalize this example to computing correlations not just for a lag of one sample, as above, but for //any// lag:

<code matlab>
[acf,lags] = xcorr(wnoise,100,'coeff');
plot(lags,acf,'LineWidth',2); grid on;
set(gca,'FontSize',18); xlabel('time lag (samples)'); ylabel('correlation ({\itr})');
</code>

You should get:

{{ :analysis:course:week7_fig2.png?600 |}}

As expected, the autocorrelation of white noise is essentially zero for any lag (except 0).

What if we try a signal that is not just white noise, but has a clear periodicity?

<code matlab>
Fs = 500; dt = 1./Fs;
tvec = 0:dt:2-dt;

wnoise = randn(size(tvec));
sgnl = sin(2*pi*10*tvec) + wnoise; % 10 Hz sine plus white noise: a signal with clear periodicity

subplot(211);
plot(tvec,sgnl); grid on;
set(gca,'FontSize',18); title('signal');

subplot(212);
[acf,lags] = xcorr(sgnl,100,'coeff');
lags = lags.*dt; % convert samples to time
plot(lags,acf); grid on;
set(gca,'FontSize',18); xlabel('time lag (s)'); ylabel('correlation ({\itr})');
title('autocorrelation');
</code>

Now we get:

{{ :analysis:course:week7_fig3.png?600 |}}

The autocorrelation now has peaks at multiples of 0.1s, corresponding to the 10Hz component in the original signal. An intuitive interpretation of this plot is: given a peak in the signal at time t = 0, what on average are the expected values at other times? We can see that if there is a peak at 0, then we expect peaks at 0.1, 0.2, etc. and troughs at 0.05, 0.15, and so on.

☛ What would you expect the autocorrelation to look like if the above code were modified to use a //rectified// sine wave (''abs(sin(..))'')? Verify your prediction. When you do, note that the periodicity becomes difficult to see by eye in the raw signal, yet the acorr pulls it out.
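
One way to set up this check is sketched below, reusing ''tvec'', ''wnoise'', and ''dt'' from the block above (the plot layout simply mirrors the earlier example):

<code matlab>
sgnl_r = abs(sin(2*pi*10*tvec)) + wnoise; % rectified 10 Hz sine plus the same noise

subplot(211);
plot(tvec,sgnl_r); grid on; title('rectified signal');

subplot(212);
[acf,lags] = xcorr(sgnl_r,100,'coeff');
plot(lags.*dt,acf); grid on;
xlabel('time lag (s)'); ylabel('correlation ({\itr})');
</code>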

== Application to real data ==

Is there any periodicity in ventral striatal gamma power? That is, given high power at t = 0, what values do we expect at other times? If gamma power varies randomly, then we expect the acorr to look like that for white noise. If, on the other hand, high gamma power at time 0 tells us something about what powers to expect at other times, we will see something different.

First, let's get a finer-resolution version of the spectrogram:

<code matlab>
% spectrogram with better time resolution (ideally this would use FieldTrip; note this is slow)
Fs = 500;
d = decimate(Data(cscR),4); % downsample from 2000 Hz to 500 Hz
[S,F,T,P] = spectrogram(d,hanning(125),115,1:200,Fs);
</code>

What we need is a time series of gamma power, so that we can compute its autocorrelation. We will get this from the spectrogram, but it is worth noting that this is not the only way to obtain such a signal. Recall, for instance, that in module 5 we used a filter to compute instantaneous power in the ripple band. An advantage of using the spectrogram is that it provides estimates of power over time across all frequencies, so it is easy to explore. However, more precise estimates can be obtained with the filtering method (as well as with better spectrograms, such as the event-averaged one we computed with FieldTrip in module 6).
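
For comparison, here is a rough sketch of what the filtering approach could look like for low-gamma power; the filter order and band edges are illustrative choices, not the exact settings used in module 5:

<code matlab>
% bandpass filter the decimated signal in the low-gamma band, then take the Hilbert envelope
Wn = [50 65] ./ (Fs/2); % band edges normalized by the Nyquist frequency (Fs = 500 from above)
[b,a] = butter(4,Wn); % Butterworth bandpass (illustrative choice of order)
d_filt = filtfilt(b,a,d); % zero-phase filtering to avoid introducing phase shifts
lg_power = abs(hilbert(d_filt)).^2; % instantaneous low-gamma power at the full sampling rate
</code>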

For now we make do with what we have:

<code matlab>
F_idx = find(F > 70 & F < 100); % high-gamma power
F_idx2 = find(F > 50 & F < 65); % low-gamma power

pwr = mean(P(F_idx,:)); % average across frequencies
pwr2 = mean(P(F_idx2,:));

[ac,lags] = xcorr(pwr2,pwr2,50,'coeff'); % autocorrelation of low-gamma power
lags = lags.*mean(diff(T)); % convert lags from spectrogram bins to seconds
figure;
plot(lags,ac)
set(gca,'FontSize',18); xlabel('time lag (s)'); ylabel('correlation ({\itr})');
title('low-gamma autocorrelation');
</code>

You should get:

{{ :analysis:course:week7_fig4.png?600 |}}

Some differences with our reference acorrs (from the white noise and the 10Hz sine wave) are apparent. First, the central peak around t = 0 now has a certain nonzero width: if low-gamma power is high at time t = 0, it tends to stay elevated for a certain amount of time. Judging from the inflection point at around 0.2s, it appears that when low-gamma power is high, it takes about 0.2s to return to a kind of baseline.
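
To attach a rough number to the width of this central peak, one option is to find the first positive lag at which the autocorrelation drops below some criterion -- a small sketch, where the 0.5 threshold is an arbitrary choice for illustration:

<code matlab>
crit = 0.5; % arbitrary criterion for "still elevated"
pos_idx = find(lags > 0); % indices of positive lags
width_idx = pos_idx(find(ac(pos_idx) < crit,1,'first')); % first positive lag below the criterion
peak_width = lags(width_idx) % in seconds, since lags was converted above
</code>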

☛ What do you expect would happen to the width of this central peak if you were to change the width of the time window used to compute the spectrogram, say to twice the original width?

Second, a small dip below the baseline is apparent around 0.6s. This suggests that if low-gamma power is high at t = 0, we are less likely than average to see high low-gamma power 0.6s later.

=== Cross-correlation for continuous signals ===

The //autocorrelation// computes the correlation of a signal with itself over a range of time lags. The //cross-correlation// does the same, but between two //different// signals. It asks: given a peak in signal 1 at time zero, what do we know about signal 2?

Let's apply this idea to ventral striatal gamma power by asking how high-gamma power varies given a peak in low-gamma power at t = 0:

<code matlab>
F_idx = find(F > 70 & F < 100); % high-gamma power
F_idx2 = find(F > 50 & F < 65); % low-gamma power

pwr = mean(P(F_idx,:)); % average across frequencies
pwr2 = mean(P(F_idx2,:));

[ac,lags] = xcorr(pwr,pwr2,50,'coeff'); % cross-correlation of high-gamma and low-gamma power
lags = lags.*mean(diff(T)); % convert lags from spectrogram bins to seconds
figure;
plot(lags,ac)
set(gca,'FontSize',18); xlabel('time lag (s)'); ylabel('correlation ({\itr})');
title('xcorr (high-gamma relative to low-gamma at t = 0)');
</code>

You should see:

{{ :analysis:course:week7_fig5.png?600 |}}

A clear peak in the xcorr is visible: high-gamma power tends to be maximal about 0.25s //before// a peak in low-gamma power. Note that the xcorr is asymmetric: a low-gamma peak does not predict elevated high-gamma power to follow as strongly as in the other direction. In other words, if there is strong high-gamma power, then elevated low-gamma power is likely to follow, but the reverse holds to a lesser extent (if at all). This result is consistent with the pattern we saw in the spectrogram in the previous module.
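
This asymmetry can be made explicit by comparing the largest correlation on either side of zero lag (a minimal sketch using the ''ac'' and ''lags'' variables from the block above):

<code matlab>
r_neg = max(ac(lags < 0)); % peak correlation over negative lags
r_pos = max(ac(lags > 0)); % peak correlation over positive lags
fprintf('peak r at negative lags: %.2f, at positive lags: %.2f\n',r_neg,r_pos);
</code>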