music21.webapps.commands

Functions

music21.webapps.commands.checkLeadSheetPitches(worksheet, returnType='')

Checker routine for the hack-day demo lead-sheet chord-symbols exercise. Accepts a stream containing both the chord symbols and the student's chords, and returns the corrected stream. If returnType='answerkey', the score is returned with the lead-sheet pitches realized.

>>> worksheet = stream.Stream()
>>> worksheet.append(harmony.ChordSymbol('C'))
>>> worksheet.append(harmony.ChordSymbol('G7'))
>>> worksheet.append(harmony.ChordSymbol('B'))
>>> worksheet.append(harmony.ChordSymbol('D7/A')) 
>>> answerKey = webapps.commands.checkLeadSheetPitches(worksheet, returnType='answerkey')
>>> for x in answerKey.notes:
...     [str(p) for p in x.pitches]
['C3', 'E3', 'G3']
['G2', 'B2', 'D3', 'F3']
['B2', 'D#3', 'F#3']
['A2', 'C3', 'D3', 'F#3']

music21.webapps.commands.colorAllChords(sc, color)

Iterates through all chords in sc and changes their color to the given color; used for testing color rendering in Noteflight.

music21.webapps.commands.colorAllNotes(sc, color)

Iterates through all notes in sc and changes their color to the given color; used for testing color rendering in Noteflight.
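A minimal usage sketch for both coloring helpers, in the same doctest style as the rest of this page (assumptions: the corpus piece is arbitrary, and the functions are taken to return the modified stream, which the one-line descriptions above do not state):

>>> sc = corpus.parse('bwv66.6')                                           # any parsed score
>>> greenNotes = webapps.commands.colorAllNotes(sc, '#00cc33')             # color every note green
>>> redChords = webapps.commands.colorAllChords(sc.chordify(), '#cc3300')  # color every chord red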

music21.webapps.commands.correctChordSymbols(worksheet, studentResponse)

Written for the hack-day demo: accepts as parameters a stream with chord symbols (the worksheet) and the student's attempt to write out the pitches for each chord symbol of the worksheet. The student's work is returned with annotations, and the percentage correct is also returned.

>>> worksheet = stream.Stream()
>>> worksheet.append(harmony.ChordSymbol('C'))
>>> worksheet.append(harmony.ChordSymbol('G7'))
>>> worksheet.append(harmony.ChordSymbol('B-'))
>>> worksheet.append(harmony.ChordSymbol('D7/A')) 
>>> studentResponse = stream.Stream()
>>> studentResponse.append(clef.TrebleClef())
>>> studentResponse.append(chord.Chord(['C','E','G']))
>>> studentResponse.append(chord.Chord(['G', 'B', 'D5', 'F5']))
>>> studentResponse.append(chord.Chord(['B-', 'C']))
>>> studentResponse.append(chord.Chord(['D4', 'F#4', 'A4', 'C5']))
>>> newScore, percentCorrect = webapps.commands.correctChordSymbols(
...     worksheet, studentResponse)
>>> for x in newScore.notes:
...     x.lyric
':)'
':)'
'PITCHES'
'INVERSION'
>>> percentCorrect
50.0

Returns the annotated score and the percentage correct.

music21.webapps.commands.createMensuralCanon(sc)

Implements the music21 example of creating a mensural canon from the score sc.
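A minimal invocation sketch in the same doctest style (assumptions: the corpus piece is arbitrary, and the function is taken to return the canon as a new stream, which the one-line description does not state):

>>> melody = corpus.parse('bwv66.6')                      # source piece for the canon
>>> canon = webapps.commands.createMensuralCanon(melody)  # assumed to return the canon as a new stream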

music21.webapps.commands.determineDissonantIdentificationAccuracy(scoreIn, offsetList, keyStr=None)

Runs a comparison on the score to identify dissonances, then compares against the user's offsetList of identified dissonances. The score is colored according to the results, and the relevant information is returned as a dictionary. See runPerceivedDissonanceAnalysis for full details and an example.

Color key:
* Green: the user also recognized this as a dissonant vertical slice
* Red: the user did not recognize this as a dissonant vertical slice
* Blue: the user recognized it as a dissonant vertical slice (music21 did not)

>>> s = stream.Score()
>>> p = stream.Part()
>>> c1 = chord.Chord(['C3','E3','G3'])
>>> c1.isConsonant()
True
>>> p.append(c1)
>>> c2 = chord.Chord(['C3','B3','D#'])
>>> c2.isConsonant()
False
>>> p.append(c2)
>>> c3 = chord.Chord(['D3','F#3','A'])
>>> c3.isConsonant()
True
>>> p.append(c3)
>>> c4 = chord.Chord(['B-4','F#4','A-3'])
>>> c4.isConsonant()
False
>>> p.append(c4)
>>> p.makeMeasures(inPlace=True)
>>> s.append(p)
>>> aData = webapps.commands.determineDissonantIdentificationAccuracy(s, [2.3,3.2])
>>> chords = aData['stream'].flat.getElementsByClass('Chord')
>>> chords[0].color == None #BLACK (by default)
True
>>> chords[1].color #RED
'#cc3300'
>>> chords[2].color #BLUE
'#0033cc'
>>> chords[3].color #GREEN
'#00cc33'

music21.webapps.commands.generateChords(numChords, kind='')

Randomly generates a score of chords for use with the perceived-dissonances app. These chords may be dissonant or consonant. If kind='diatonicTriads', only diatonic triads will be generated.

>>> sc = webapps.commands.generateChords(4,'diatonicTriads')
>>> a = webapps.commands.runPerceivedDissonanceAnalysis(sc,[1.2,3.2,5.2])
>>> chords = a['fullScore']['stream'].flat.getElementsByClass('Chord')
>>> chords[0].color != None
True
>>> chords[1].color != None
True
>>> chords[2].color != None
True
>>> chords[3].color in [None, '#cc3300']
True
>>> sc2 = webapps.commands.generateChords(4)
>>> a = webapps.commands.runPerceivedDissonanceAnalysis(sc2,[1.2,3.2])
>>> chords = a['fullScore']['stream'].flat.getElementsByClass('Chord')
>>> chords[0].color != None
True
>>> chords[1].color != None
True
>>> chords[2].color in [None, '#cc3300']
True
>>> chords[3].color in [None, '#cc3300']
True

music21.webapps.commands.generateIntervals(numIntervals, kind=None, octaveSpacing=None)

music21.webapps.commands.reduction(sc)

music21.webapps.commands.runPerceivedDissonanceAnalysis(scoreIn, offsetList, keyStr=None)

Perceived Dissonances: demo app for the NEMCOG meeting, April 28, 2012.

Web app for determining the accuracy of aural identification of dissonances. The user listens to a piece of music and clicks when they think they hear a dissonance; this information is then passed to this method, which compares the score against the list of offsets corresponding to when the user clicked. music21 then identifies the dissonant vertical slices and returns the results as a dictionary that includes the score, colored by vertical slices of interest as follows:

* Green: both music21 and the user identified it as dissonant
* Blue: only the user identified it as dissonant
* Red: only music21 identified it as dissonant

This example runs two analyses: the first compares the unmodified score with the user's offsets; the second does the same with the passing tones and neighbor tones of the score removed. Results are returned as a nested dictionary with two entries, 'fullScore' and 'nonharmonicTonesRemovedScore', each of which is itself a dictionary containing the keys 'stream', 'numUserIdentified', 'numMusic21Identified', 'numBothIdentified', 'accuracy', 'romans', and 'key'.

>>> piece = corpus.parse('bwv7.7').measures(0,3)
>>> offsetList = [
...     1.1916666666666667,
...     2.3641666666666667,
...     3.6041666666666665,
...     4.5808333333333335,
...     6.131666666666667,
...     8.804166666666667,
...     10.148333333333333,
...     11.700833333333334,
...     ]
>>> analysisDict = webapps.commands.runPerceivedDissonanceAnalysis(piece, offsetList)
>>> a = analysisDict['fullScore']
>>> a['numMusic21Identified']
7
>>> a['numBothIdentified']
3
>>> a['numUserIdentified']
8
>>> a['romans']
['v43', 'iio65', 'bVIIb73']
>>> b = analysisDict['nonharmonicTonesRemovedScore']
>>> b['numMusic21Identified']
5
>>> b['numBothIdentified']
2
>>> b['accuracy']
40.0 

Returns the dictionary of results described above.

music21.webapps.commands.writeMIDIFileToServer(sc)

Writes the given score sc out as a MIDI file on the server.
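A minimal invocation sketch (assumption: the return value is taken to be the server-side location of the written MIDI file, which is inferred from the function name rather than documented here):

>>> sc = corpus.parse('bwv66.6')                             # any parsed score
>>> serverPath = webapps.commands.writeMIDIFileToServer(sc)  # assumed: location of the MIDI file on the server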