Beginning with the first portion of the Tom Petty song, the following spectrum was compiled from 18 to 25 seconds into the track.
The spectrum is clearly more crowded than in the single-guitar case from part 2. The bass guitar is visible around 50 Hz, while the vocals are intermixed with the acoustic guitar at the higher frequencies. Luckily, most songs follow basic music theory: the vocal melody lands on frequencies that match the guitar's notes, so a proper chord is maintained between them. The output for this time window is shown below. Note that t = 0 corresponds to 18 seconds into the song.
Time = 0.0, Chord = B , Notes = B
Time = 0.3, Chord = E , Notes = B E
Time = 1.0, Chord = B , Notes = B
Time = 1.8, Chord = Esus, Notes = A B
Time = 2.8, Chord = Esus, Notes = A B
Time = 3.6, Chord = E , Notes = E
Time = 4.9, Chord = F# , Notes = F#
Time = 5.6, Chord = A , Notes = A E
This performance is quite poor. The proper chord progression in the song at this point in time is E - A - E - A. Examining the underlying notes, however, we can see that the algorithm did correctly measure parts of each chord, just not enough to properly reconstruct it. The E chord has a B in it, and obviously the A chord has an A in it. The F# is a total mystery.
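To make the notes-to-chord step concrete, here is a minimal sketch of template matching in Python. The chord dictionary and the overlap-minus-extras scoring rule are assumptions for illustration, not the actual implementation used by the program:

```python
# Hypothetical chord templates (note sets) for the chords seen in the output.
# "Esus" is treated as Esus4 here -- an assumption for this sketch.
CHORDS = {
    "E":    {"E", "G#", "B"},
    "Esus": {"E", "A", "B"},
    "A":    {"A", "C#", "E"},
    "F#":   {"F#", "A#", "C#"},
    "B":    {"B", "D#", "F#"},
}

def best_chord(notes):
    """Return the chord template that best matches the detected notes."""
    notes = set(notes)
    scored = []
    for name, template in CHORDS.items():
        overlap = len(notes & template)
        if overlap:
            # Reward matched notes, penalize detected notes outside the template.
            scored.append((overlap - len(notes - template), name))
    if not scored:
        return "NA"
    return max(scored)[1]

print(best_chord(["A", "B"]))    # → Esus (both notes fit the Esus template)
print(best_chord(["B", "F#"]))   # → B
```

With only one or two detected notes, several templates can tie, which is exactly why partial detections lead to unstable chord labels like the ones above.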
Moving on to Green Day's "When It's Time", the following is a spectrum from the first 11 seconds of the song.
[caption id="attachment_263" align="aligncenter" width="366" caption="Green Day - When It's Time, time 0 to 11 seconds"][/caption]
Note that there is nothing at 50 Hz where the bass guitar was in the last song. This is because the beginning of "When It's Time" is acoustic guitar and vocals only. The following is the program output for the same time period.
Time = 0.0, Chord = G , Notes = D G
Time = 1.5, Chord = G , Notes = B D G
Time = 3.3, Chord = D , Notes = A D F#
Time = 4.1, Chord = D , Notes = D
Time = 5.9, Chord = G , Notes = B D
Time = 9.7, Chord = G , Notes = G
This shows some improvement. The proper chord progression for this time period in the song is G - D/F# - Em - C. The Em is missed entirely, and the ending C chord is detected only as a G note (the fifth of C). This performance, though, still leaves much to be desired.
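The note-level output above can be thought of as spectral peak frequencies snapped to the nearest equal-tempered pitch. A sketch of that conversion, assuming standard A4 = 440 Hz tuning (this helper is illustrative, not the program's actual code):

```python
import math

# Pitch classes within one octave, starting at C.
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def note_name(freq_hz):
    """Map a frequency to its nearest equal-tempered note name (A4 = 440 Hz)."""
    semitones_from_a4 = round(12 * math.log2(freq_hz / 440.0))
    # A sits at index 9 of NOTE_NAMES, so offset before wrapping to the octave.
    return NOTE_NAMES[(semitones_from_a4 + 9) % 12]

print(note_name(196.0))   # → G (G3)
print(note_name(392.0))   # → G (G4, the fifth of C major)
```

Under this view, detecting only the fifth of a C chord as a G note is unsurprising: the fifth is often the strongest partial-reinforced pitch besides the root.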
The spectrum for the final song, Coldplay's "Yellow", is shown below. This spectrum is for a time window of 33 to 50 seconds (the first main verse), containing guitar, bass, drums, and vocals.
[caption id="attachment_264" align="aligncenter" width="366" caption="Coldplay - Yellow, time 33 to 50 seconds"][/caption]
The following is the program output for that same time period.
Time = 0.0, Chord = B , Notes = B F#
Time = 2.3, Chord = B , Notes = B F#
Time = 3.8, Chord = B , Notes = B F#
Time = 4.4, Chord = NA , Notes = B C F#
Time = 5.4, Chord = B , Notes = B F#
Time = 6.1, Chord = F# , Notes = C# F#
Time = 6.7, Chord = F# , Notes = F#
Time = 10.0, Chord = F# , Notes = C# F#
Time = 10.5, Chord = F# , Notes = F#
Time = 11.3, Chord = NA , Notes = E F F#
Time = 11.8, Chord = NA , Notes = E F
Time = 12.5, Chord = Bsus, Notes = E F#
Time = 13.1, Chord = E , Notes = E
Time = 14.1, Chord = NA , Notes = B E F
Time = 15.4, Chord = E , Notes = B E
Time = 16.1, Chord = NA , Notes = E F
The performance here is actually quite good, as the algorithm managed to pick out all three chords in the progression B - F# - E. There is some trouble with the E chord, as the neighboring F note is incorrectly detected at times. F is only a half step away from E, so this could be the result of difficulty in measuring the E note cleanly in the presence of noise.
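A quick back-of-the-envelope calculation shows how small that half step is in frequency terms. In equal temperament each semitone is a factor of 2^(1/12) ≈ 1.0595, so E4 and F4 sit only about 6% apart:

```python
# One semitone in equal temperament is a frequency ratio of 2**(1/12).
E4 = 329.63                 # Hz
F4 = E4 * 2 ** (1 / 12)     # ≈ 349.23 Hz, only ~19.6 Hz above E4
print(f"E4 = {E4:.2f} Hz, F4 = {F4:.2f} Hz, gap = {F4 - E4:.2f} Hz")
```

With a short analysis window (for example, 0.1 s gives 10 Hz FFT bin spacing), a noisy or leakage-widened E peak can easily spill into the bin a semitone above it.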
The algorithm's performance against these three songs shows some promise, but the accuracy clearly needs to be improved. In the next and final step, we will improve the program to the point where it can reliably reconstruct the chords in these same three songs. Additional time windows will be measured as well to demonstrate performance at different points within the songs.