
What is music21?

Music21 is a set of tools for helping scholars and other active listeners answer questions about music quickly and simply. If you’ve ever asked yourself a question like, “I wonder how often Bach does that” or “I wish I knew which band was the first to use these chords in this order,” or “I’ll bet we’d know more about Renaissance counterpoint (or Indian ragas or post-tonal pitch structures or the form of minuets) if I could write a program to automatically write more of them,” then music21 can help you with your work.

How simple is music21 to use?

Extremely. After starting Python and typing "from music21 import *" you can do each of these things with only a single line of music21 code:

Display a short melody in musical notation:
converter.parse("tinynotation: 3/4 c4 d8 f g16 a g f#").show()

Print the twelve-tone matrix for a tone row (in this case the opening of Schoenberg's Fourth String Quartet):
print(serial.rowToMatrix([2, 1, 9, 10, 5, 3, 4, 0, 8, 7, 6, 11]))

or, since all the Second Viennese School rows are already available as objects, you can type:
print(serial.getHistoricalRowByName('RowSchoenbergOp37').matrix())

Convert a file from Humdrum's **kern data format to MusicXML for editing in Finale or Sibelius:
converter.parse('/users/cuthbert/docs/composition.krn').write('musicxml')

With five lines of music21 code or less, you can:

Prepare a thematic (incipit) catalog of every Bach chorale that is in 3/4:

catalog = stream.Opus()
for work in corpus.chorales.Iterator():
  firstTS = work.recurse().getElementsByClass('TimeSignature')[0]
  if firstTS.ratioString == '3/4':
    catalog.append(work.measures(0, 2))
catalog.show()

Google every motet in your database that includes the word ‘exultavit’ in the superius (soprano) part (even if it is broken up into multiple syllables in the source file) to see how common the motet's text is (assuming you have a bunch of motets in "listOfMotets"):

import webbrowser
for motet in listOfMotets:
  superius = motet.parts[0]
  lyrics = text.assembleLyrics(superius)
  if 'exultavit' in lyrics:
    webbrowser.open('http://www.google.com/search?&q=' + lyrics)

Add the German name (i.e., B♭ = B, B = H, A♯ = Ais) under each note of a Bach chorale and show the new score:

bwv295 = corpus.parse('bach/bwv295')
for thisNote in bwv295.recurse().notes:
  thisNote.addLyric(thisNote.pitch.german)
bwv295.show()

Of course, you are never limited to just using five lines to do tasks with music21. In the demos folder of the music21 package and in the sample problems page (and throughout the documentation) you’ll find examples of more complicated problems that music21 is well-suited to solving, such as cataloging the rhythms of a piece from most to least-frequently used.

Music21 builds on preexisting frameworks and technologies such as Humdrum, MusicXML, MuseData, MIDI, and Lilypond, but adds an object-oriented skeleton that makes it easier to handle complex data. At the same time, music21 tries to keep its code clear and to make reusing existing code simple. With music21, once you (or anyone else) have written a program to solve a problem, that program can easily become a module to be adapted or built upon to solve dozens of similar (but not identical) problems.

Interested in learning more?

Latest music21 News

[August 6, 2017 5:11 pm] [music21]
Music21 version 4 was released today (August 6, 2017).  It is the first major release of music21 in a year, and it brings with it a wealth of new tools for analyzing music with a computer and performing digital musicology, music theory, and composition.

Download from https://github.com/cuthbertLab/music21/releases or from the terminal, type:

    pip3 install --upgrade music21

(or without the "3" if you are using Python 2)

Version 4 is the last version of music21 that will support Python 2.7.  If you run Version 4 on Python 2.7, you will see a warning that it's time to move up to the brilliance that is Python 3.6.

As with all new "X" release names, v.4 has backward incompatible behaviors that I think are worth it for the great new features. 
Among the 272 commits since v.3.1:
Major new features:
  • Graphing rewrite!
.plot() and graphing have always been among the most powerful parts of music21, since long before v.1.0 (mad props, Christopher Ariza!), but they have also been among the most daunting aspects of using music21. They shouldn't be anymore. The code has gone through a major rewrite to improve the simplicity of doing easy things and the power available when doing difficult things. The easy things are documented in Chapter 22 and the hard things in Chapter 44.
  • Local Corpora are great!
There's been a major rewrite of the corpus.corpora.LocalCorpus() class that makes it a fantastic way to work quickly with files on your own hard drive.  Set up your own local corpus, add paths to it, and set the cacheFilePath to somewhere to store the metadata, and you'll be able to search for pieces with particular features and metadata without needing to parse every score.
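Here's a rough sketch of that workflow (the folder and cache-file paths are hypothetical, and exact call names may vary slightly by version):

from music21 import corpus

myCorpus = corpus.corpora.LocalCorpus()
myCorpus.addPath('~/Documents/myScores')    # a folder of your own files
myCorpus.cacheFilePath = '~/Documents/myScores/.music21cache'    # where the metadata lives
myCorpus.save()    # remember this corpus across sessions
myCorpus.cacheMetadata()    # parse everything once; later searches need no parsing

for result in myCorpus.metadataBundle.search('sarabande', field='title'):
  print(result)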

  • Style!
The all-new style module and its style.Style object handle aspects of a note's (or other object's) visual display that are not (usually) semantic. This class has allowed a major increase in the ability to properly preserve MusicXML visual formatting on import and export.
Style objects are created only when needed, so el.hasStyleInformation allows checking for the presence of a .style object without creating one.
(To be documented more soon)
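A tiny illustration of the on-demand behavior (a sketch; the printed values are shown as comments):

from music21 import note

n = note.Note('C4')
print(n.hasStyleInformation)    # False: no Style object exists yet
n.style.color = 'red'    # first access to .style creates one
print(n.hasStyleInformation)    # True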
  • Major rewrite of TinyNotation allows for easy extensibility.
Documented in the User's Guide! Check it out!
  • Always improving docs
The User's Guide goes up to chapter 24 now, with major new examples in Chapter 20 along with rewritten chapters on keys, time signatures, sorting, and so on and so on. Plus all examples are now Retina quality for viewing fine details of scores.
Documentation has been moved out of the music21 directory into the root directory -- it is no longer installed with music21 from pip -- this change was necessitated by the move to retina quality graphics, but reduces the installation size from 90MB to 15MB for the full corpus version and 6MB for the no-corpus version.  All documents are now tested with nbval, to ensure they stay up to date.
Other new features
  • Stream.measures() now optionally allows for indexes (where 0 always is the first measure; -1 is the last, and so on) making .getElementsByClass('Measure') not necessary in most cases.
  • much better metadata processing in musicxml, humdrum, and braille
  • improved braille translation (tuplets) -- thanks Bo-Chen
  • better beaming, meter, and tuplets in ABC
  • output directly to PDF if MuseScore is installed.
  • Nested Tuplets! including in MusicXML.
  • Non-traditional key signatures
  • New works by Clara Schumann in the corpus.
  • stream.iterators.OffsetIterator() -- iterate groups of objects by offset.
  • improvements to analysis.discrete
  • demos/build_melody shows how to build MidiFile directly (thanks PeterMitrano!)
  • corpus paths are now searchable in corpus.search()
  • matplotlib and musescore graphics in Jupyter notebook are now retina quality.
  • Chord.add() and Chord.remove() allow for direct manipulation of chords.
  • Improvements to parallel processing in music21.
  • Ottava spanners now come in two types, transposing and non-transposing -- reflecting whether the pitches under the spanner already reflect the transposition (non-transposing) or not.
  • Palestrina humdrum has been reorganized and parses completely.
  • Many improvements to spanners and RomanNumerals.
  • Beams work much better and transfer in and out of MusicXML more completely.
  • Every MusicXML 3.0 articulation is now supported.
  • Core routines in stream.core have now been exposed publicly.  They are dangerous to use, but for anyone working on their own parsers, they can speed up insertions and appends by an order of magnitude.
Others, including bugs squashed:
  • Warning on Python 2 that music21 v. 4 is the last version to support Py 2.
  • Stream.template() is a great way to get an empty stream that otherwise matches the current Stream.  Replaces the obsolete .measureTemplate()
  • ABC key signature and mode error fixed.
  • RecursiveIterator gets a .currentOffsetInHierarchy, which can let even more places remove the dependence on .flat.  In fact, .flat uses the .recurse() method internally, because recurse() is now so fast.
  • AudioSearch bugs fixed (thanks jjrob13)
  • Chord.normalOrderString (thanks emzhang)
  • Removed lots of old crutches including the "analysisData" on Stream, Note.editorial, and others. Style fixes most of this.
  • fix to Bach BWV 386 (thanks alexcoplan) and to Beethoven Opus 59 no 3, movement 1.
  • Note.pitches returns a tuple not list, just like Chord.pitches
  • Converter can deal with some wrong file extensions now.
  • Instrument reprs are fixed
  • configure finds many more notation programs.
  • ties are imported better between elements in and out of voices in musicxml
  • configure works on macOS when user directory contains spaces.
  • Bugs in ending and restarting a recursiveIterator fixed.
  • doc errors fixed (thanks Andrew Sanchez)
  • bestClef() has been moved to the clef module where it belongs.
  • MusicXML sound tag now is placed properly (thanks Almog Cohen)
  • MIDI output from transposed scores now plays in concert pitch.
  • Breves are now acceptable as full measure rests.
Deprecations and deprecated elements removed
  • (this list does not contain changes to the alpha/ directory which can change at any time)
  • Note.ps, Note.accidental, Note.pitchClass, Note.pitchClassString, Note.diatonicNoteNum, and Note.microtone are all deprecated. Use Note.pitch.ps, etc., instead.
  • Chord.normalForm is deprecated because it gave the wrong answer; use normalOrder instead. Same with normalFormString.
  • SpannerBundle.list is deprecated; use list(SpannerBundle) instead
  • with the advent of .style, el.color is deprecated, use el.style.color instead
  • Stream.stream() is deprecated -- now that the transition to iterators is done, there should be no need for this.
  • REMOVED stream.getOffsetByElement; use s.elementOffset(el) instead.
  • REMOVED stream.haveBeamsBeenMade; use stream.streamStatus.haveBeamsBeenMade
  • REMOVED stream.makeTupletBrackets(); use stream.makeNotation.makeTupletBrackets(s)
  • REMOVED stream.realizeOrnaments; use stream.makeNotation.realizeOrnaments(s)
  • REMOVED VirtualCorpus -- it may return at some point but with a lot more features.
  • nbconvert is no longer packaged with music21
  • .exe files are no longer generated -- they were rarely used and pip is a better choice for Windows users now.

As the last version of music21 to run on Python 2, version 4 will have a longer support period for security patches and for fixes to major bugs that render large parts of the system unusable for multiple users. This LTS (long-term support) period will run until the clock on Python 2 runs out (currently two more years and eight months).  After that time, Python 2 will no longer be supported by the Python Software Foundation, and thus not by music21 either.  We're looking forward to joining the glorious Python 3-only future and finally getting to use some great features to make development faster and more stable.  Music21 v. 5 will be released at about the same time as Django v. 2 (Py3 only) and a year after IPython/Jupyter made their first Python 3-only release.  Matplotlib will soon follow.  Python 3 is the future of humanistic and scientific programming.
As always, we thank the community for great support.  We'd always love to hear how you use music21 via the Google Groups mailing list.  Music21 was made possible by grants from the Seaver Institute, the National Endowment for the Humanities, and the School of Humanities, Arts, and Social Sciences / Music and Theater Arts Section at MIT.
[August 22, 2016 12:52 pm] [music21]
Version 3 of music21 is here!  This is the first major release in 11 months, with nearly 600 commits since the last version.  As a new major version, there are both huge new features as well as significant (often backwards incompatible) changes.

Upgrades for most users should work automatically by typing:

   pip3 install --upgrade music21

or (Python 2)

   pip install --upgrade music21

Or download from GitHub

A summary of the most important changes since v.2.2 are below:

Fun and easy to understand changes:

Lots more MusicXML support -- better support for time signatures, for grace notes, for spanners, for metadata, for...you name it!  And the system is refactored in such a way as to make contributing missing features quite easy.

MIDI files play back in Jupyter/IPython contexts.  Lots of improvements there for people who use MuseScore.

The User's Guide has become much more awesome, and you can play with all the good features there.

Many new ways to search scores -- LyricSearcher is fully polished and documented in the User's Guide.  Carl Lian's search.serial is upgraded from alpha to a full release that will be easy to expand in the future -- want to know where certain motives are used or transformed? This will make it easy to do. And search.segment -- the old standby -- is even better than ever (see below)

Try splitting notes and recombining them -- lots of intelligence going into this.

There's a big difference between taking a passage up a major third and taking it up 4 semitones.  In chromatic contexts, music21 will now spell things the way a musician would like to see them.  If you're working with MIDI data, without explicit enharmonics specified, you'll appreciate this.

Go ahead and use '~/../dir' and things like that in your file parsing -- music21 groks all that.

Look at you, fancy programmer, with your 4-CPU laptop! Why not give common.runParallel(tasks, function) a try and get music21 working 2-3x faster than before?  (just make sure that "tasks" isn't a list of Streams).  Oh, and if you're using search.segment to search for a particular passage inside a large collection of files, don't worry about using runParallel -- we'll do that for you automatically.
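A minimal sketch of that call (the paths and the note-counting task are made up for illustration):

from music21 import common, converter

def countNotes(path):
  # each task gets one file path; return something small and picklable, not a Stream
  return len(converter.parse(path).recurse().notes)

paths = ['/scores/one.xml', '/scores/two.xml', '/scores/three.xml']
results = common.runParallel(paths, countNotes)    # one result per task
print(results)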

Docs are pretty and much better.  The User's Guide is the place to start.

Too many other changes to mention, but some shoutouts to Shimpe for new Lilypond code (triplet chords, etc.), Frank Zalkow for enharmonic spelling, dynamics, tempos, and other things, Chris Antilla for continued dedication to MEI processing, Emily Zhang for hashing functions and speeding up MIDI quantization, Sonovice for articulation handling, Dr. Schmidt for chord symbol translations, Bagratte for IO cross Py2/3 fixes, and Bo-Cheng Jhan for great braille contributions.


Big under the hood changes.

Big, backwards incompatible change: Many calls such as .parts, .notes, .getElementsByClass(), .getElementsByOffset(), etc. no longer return Streams.  They now are iterators (returning something called a StreamIterator).  For most uses, this is not going to change anything.  You can still use: for n in myStream.notes: and it'll work great. It makes many parts of music21 much, much faster.  For small scores, the differences will be small.  For large scores, the differences will be tremendous, especially when filters are chained, such as: myScore.recurse().notes.getElementsByClass('Chord').getElementsByOffset(0.0).  You're going to find that writing iterator chains is an amazing way to only get at items you want, especially with custom filters.  To get the old behavior, just add .stream() to the end of your iteration.
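For example (a sketch using a corpus chorale; any score works the same way):

from music21 import corpus

s = corpus.parse('bach/bwv66.6')

# chained filters return a StreamIterator; nothing is copied into a new Stream
openingNotes = s.recurse().notes.getElementsByOffset(0.0)
for n in openingNotes:
  print(n, n.measureNumber)

# add .stream() at the end to get back the old Stream-returning behavior
openingStream = s.recurse().notes.getElementsByOffset(0.0).stream()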

Because none of these filters change the activeSite of an element, you'll find that this is much more stable than before.

If you want to know what the note after a given note is in a musical context, call n.next() or n.previous().  If it's the last note of a measure, it'll move on to the first note of the next measure. And once you've called .next() on one note of a stream, the remaining calls will be super super fast. I still haven't wrapped my mind completely around this paradigm, but it sure beats all the fooling around I used to do to figure out whether one note was the same pitch as the next one, etc.
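A small sketch of that pattern, marking repeated pitches in one part of a chorale:

from music21 import corpus

chorale = corpus.parse('bach/bwv66.6')
for n in chorale.parts[0].recurse().notes:
  following = n.next('Note')    # the next Note in context, even across barlines
  if following is not None and following.pitch == n.pitch:
    n.lyric = 'repeated'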

If you've been using music21 for some time, but have never looked at the docs for base.contextSites(), do it -- this is a very fast and extremely powerful way of figuring out how two objects relate to each other.  Together with the .next(), .previous(), and better support for .derivation, many extremely powerful systems can be written in music21 easily that could only be written with huge difficulty before.

Nearly all functions marked deprecated in v.2. have been removed.  Lots of super obscure functions in .base.Music21Object or .sites.Sites are gone.  This is a positive step since it'll make the documentation for these objects simple enough to understand.

Sorting works.  I mean, it just works.  With grace notes, with oddly positioned elements, with what have you. And it's pretty darn fast. This might seem like something small, but it's enormous for us.

Corpus managing is much simplified -- if you ever thought in the past, "Hey, I'd like to use a custom corpus" and then thought, "uhh...no thanks..." give a look at what is needed to set one up now.  You'll be glad for it!

Musescore and not Lilypond is used in Jupyter/IPython notebooks.

Complex durations are a lot less complex -- and faster.  

PyLint on all code -- I estimate that at least 200 undetected bugs vanished through this major effort. 


The Future

As noted in messages to the music21 mailing list (music21list at Google Groups), v.3.1 is the first non-beta release in the v.3 lineup.  Version 3 happens to share a version number with Python 3, but that is merely coincidence.  Music21 version 3 continues to work with Python 2.7 as well as Python 3.4.  Version 3 adds explicit support for Python 3.5 and drops support for Python 3.3. Music21 will continue to develop into a Version 4 to be released next summer (4.0.x will be alpha and beta releases and 4.1.0 will be the public release).  Version 4 will likely be the last version to support Python 2.

This release represents the end of a year's sabbatical during which I got to work on low-level music21 functions that I didn't think anyone else would want to.  Due to teaching and other obligations, I'll be taking a break from work on the heart of music21 until the holidays (I'll still be taking bug fixes, etc.) and working more on documentation, examples, and applications.  The changes put in place for music21 v.3 have made working with it a lot more fun for me, so you'll probably see a lot more applications get added, first to the alpha directory and then into the main set.  I've also put up a version 4 roadmap (trees are almost done; they should make it in. Style objects will be introduced so that beautiful musical scores can be created, or at least imported and exported properly, without major speed losses), so if anyone wants to take the lead on a project you can do so.  I'm working on a project called STAMR, Small Tools for Agile Music Research, which should create standalone tools using music21 and music21j for musicologists to get their work done faster.  Given that I'm teaching music fundamentals online again, you should see music21 and music21j integration working far better than ever before.

Thanks to MIT for supporting my work, and the Seaver Institute and the NEH for initial funding to make music21 a success.  And thanks to this great community for all your contributions in the past and contributions to come.



[August 3, 2016 8:09 pm] [music21]
A release candidate of version 3 of music21 (3.0.6) is available now as a package on GitHub.  As a beta release it's still not 100% ready for general usage, so "pip install music21" will still install version 2.

Get it at:
https://github.com/cuthbertLab/music21/releases 

This is likely to be the Release Candidate also for music21 v.3.  I've indicated several times in the past that music21 v.3 is the last release that I guarantee will continue to support Python 2.  I now suspect that there will be a Python 2.7-compatible v.4, since Python 2.7 is still the shipping version of Python w/ macOS Sierra.

Notes about the substantial changes in Version 3 have been posted on this list several times.  The biggest changes are that .getElementsByClass() and .notes, etc. all return a new class called a “StreamIterator” which makes working with stream filtering much much faster than before, especially for very large streams. 

The new tree-based storage system is still too flaky to turn on by default except in a few cases, and will be deferred to Music21 v.4.

Laundry list of items changed since 3.0.3-alpha:

min Python 3 version is now 3.4 -- 3.3 should still work but is untested.
Notes parsed from MIDI or transposed by a number of semitones now get a .spellingIsInferred attribute which indicates that they can change their spelling ("G#" vs "A-") as needed for the situation. This is incompatible behavior, but much improved!
More User's Guide chapters...
Lots of improvements to Braille output, and refactoring to make more improvements in the future possible. (Thanks Bo-Cheng Jhan! and esp. to Jose Cabal-Ugaz in the first place).
Better documentation for Searching Lyrics (User's Guide, Chapter 28)
MIDI will now handle part names, etc. that use unicode.
Long files (>10 pages) now work with MuseScore's automatic PDF generation (thanks Emily Zhang)
TimeSignatures such as 2/4+3/8 display much better now.
TimeSignatures import and export symbols ('common', 'cut', etc.) in MusicXML properly.
Fixes for musicxml parsing where both voices and chords interact.
configure.py now finds common MusicXML readers (Finale, Sibelius, MuseScore) on Windows!
InsertIntoNoteOrChord works in more cases
Streams are faster to calculate their own durations
Major speedups to large makeMeasure() calls.
A failure in makeTies now gives a warning instead of an exception
The old "musicxmlOld" format is removed. 
metronomeMarkBoundaries works better (thanks Frank Zalkow!)
improved ability to specify how to quantize MIDI (thanks EZ!)
improvements to automatic instrument detection.
lots more MEI articulations work (thanks sonovice)
Fingerings import and export to MusicXML better.
noteheadParentheses work well now.
Roman.fromChordAndKey gets quite a few more chords now.
incompatible change: normalOrder instead of normalForm for chords -- gives the untransposed ordering of a chord -- this is the correct behavior; prior behavior is wrong.
incompatible change: Stream.offsetMap is now a method
incompatible change: common.isStr() removed -- use isinstance(s, str) or isinstance(s, (str, unicode))
incompatible change: key.KeySignature() no longer supports a .mode attribute.  Use key.Key() instead.  key.KeySignature() objects get an .asKey('major') etc. attribute.
incompatible change: pitches lose ".getMidiPreCentShift()" method.  Just call .midi instead!

in case you missed what is new in v.3.0.3:

Python 3.5 compatible.
Splitting notes via .splitXXXX() is now much safer and more efficient
Many, many parts of music21 can now use multiprocessing efficiently (see common.misc.runParallel())
BUG: Haydn Opus 74no2 was not actually ever in the corpus and has been removed.
Several MEI bugs fixed (thanks Chris Antilla)
Better automatic generation of chords from pitch classes (thanks Frank Z.)
Triplet chords work in lilypond
midi.realtime works with PyGame on Python3.
Music21Object.next() and .previous() work well and FAST 99.99% of the time.  (see docs for the tiny case when it might be a problem)
Many routines that used to return tuples now return NamedTuples
Many deprecated routines removed — code is simpler to use
Working with multiple corpora now has “only one way to do it” — simplify.
.show(‘midi’) works in IPython.

Much more documentation





[July 18, 2016 12:36 pm] [music21]
Despite the author name, this is a guest post from Christopher Witulski; he can be reached at  chris.witulski at gmail.com.  We thank him for sharing this exciting pre-publication work. -- MSC

Last year I learned about music21 and ever since I have been wondering how I can use it to learn more about the Moroccan musical repertoires that I study. Long story short, I ended up building a tool for creating interactive web-based contour visualizations from the command line and I'd like to share it here.

Climbing out of a rabbit hole

I was working through a project and struggling to keep track of things. The paper was an analysis of a genre of Moroccan sung poetry called malhun for the 2016 Analytical Approaches to World Music conference. The performance of each poem can last twenty-ish minutes and contains a number of repetitions of the refrain text. These refrains are short (roughly eight 2/4 measures long) and modulate through repetitive--but different--melodies. I had transcribed over sixty of them in an attempt to understand how they worked, how they changed, and how they were related to each other.

[Photo: Malhun performance in Fez, Morocco; the author is one of five violinists on stage behind the solo singer]

Having performed this music in Morocco (I'm the lone violinist in the photo who can't quite figure out how to play while holding the instrument upright on my knee), I was constantly struck with this feeling of déjà vu. New melodies felt so similar to old ones, but I could not put my finger on how or why. The problem was simple: I could not keep sixty or seventy different transcriptions in my head at once. Comparing them was getting tricky. I wanted a way to stack them on top of each other, almost as if I could print the transcriptions on a transparency and show them all at once on an overhead projector.

Over the previous months, I had been teaching myself Python in an effort to learn more about music21 and what it could do. It was time to try and build the tool that I needed instead of wishing I could find it.

Visualizing contours

For the presentation, I put together a small library that carried out two main tasks. First, it used music21 to parse my transcriptions, normalize the length of each melody, and build a dataset. Using offsets, frequencies, and distances from the final note of each melody, it turned note objects into a JSON of coordinates. At 1,000 different y values (each corresponding with 1/1,000th of the length of the total melody), it measured an x value for the frequency and one for the "distance from the root," the distance in steps above or below the melody's final pitch.
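In code, the idea looks roughly like this (a hypothetical sketch, not the actual ContourViz internals; it assumes a monophonic melody in one file):

import json
from music21 import converter

def contourPoints(path, samples=1000):
  notes = converter.parse(path).flat.notes.stream()
  total = notes.highestTime    # melody length in quarter notes
  root = notes[-1].pitch    # the final pitch, treated as the "root"
  points = []
  for n in notes:
    start = int(samples * n.offset / total)
    end = int(samples * (n.offset + n.quarterLength) / total)
    for i in range(start, end):    # one point per 1/1,000th of the melody
      points.append({
        'position': i,
        'frequency': n.pitch.frequency,
        'stepsFromRoot': n.pitch.diatonicNoteNum - root.diatonicNoteNum,
      })
  return points

with open('contour.json', 'w') as fp:
  json.dump(contourPoints('/path/to/transcription.xml'), fp)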

[Chart: visualization of 68 melodic contours from malhun, with one contour highlighted in bold red]

The JSON was passed to another library that I had been recently learning and working with called D3.js. It is written in JavaScript and designed for creating powerful interactive data visualizations. I supplemented my presentation with an online chart of each of my malhun transcriptions: by grouping contour lines within each poem, I was able to easily see the source of my déjà vu. Despite changes in pitch content, range, root motion, and a host of other things, the contours themselves often stayed strikingly consistent throughout the long performances. You can see the visualization and click through the different poems online, though be aware that some parts (like the "Next" button) are artifacts of the paper presentation.

Building a tool

Maybe two weeks ago I decided to try my hand at creating a Python library of my own. I simplified the chart, creating a sort of template, removed the stepwise element of the visualization, and fought my way through learning to upload a project to PyPI. The result is ContourViz... I didn't give much thought to the name, my apologies.

"Three Melodic Contours" -- one is shown in blue.
ContourViz, simple example

The tool takes, as an argument, either a music notation file or a directory containing many of them. It parses these files and creates a JSON structure of 1,000 coordinates for D3.js to work with. It then copies into the current directory a folder called results, which includes an index.html file and a folder of the JavaScript and CSS files that the generated web page will use. Finally, it runs the Python SimpleHTTPServer and opens the new page, which parses the JSON to create the visualization.

You can install ContourViz using the following in your terminal:

pip install contourviz

It runs from the command line, so creating a visualization of multiple melodies, like the one above, is as easy as:

chart-single-contour '/path/to/file.xml'

Working with a directory is similar:

chart-contours '/path/to/directory/full/of/xml/or/mxl/files'

[Chart: six melodic contours -- two groups of three overlapping contours from Damlij-Bouzouba (a more complex ContourViz example)]
I'm still toying with the system and it has a number of issues. For example, I would love for it to parse voices as individual melodies if they are present. Instead, it only works with monophonic lines, meaning that each voice has to be in an individual file if you want to visualize voice leading or other contrapuntal patterns. There are smaller issues: I still need to set up the Y axis to render note names properly.

Please feel free to check out the GitHub repo and suggest any other changes or ways in which it could be more helpful. This is my first go around at building a tool of this sort, so I am eager to hear if it is helpful and how it could be improved. And thank you for allowing me to join the community.



[July 12, 2016 6:08 pm] [music21]
The following is a guest post from Daniel McGillicuddy, alias Basso Ridiculoso.  He can be reached at daniel.mcg [at] gmail.com.   -- MSC

Hello all!

I am a gigging musician and bass player who has discovered music21, but, alas, I am certainly not a musicologist or academic.

I have seen many of the amazing examples that showcase music21’s capabilities with classical and twentieth-century music, and wanted to show how I use music21. Hopefully these examples show that music21 can also be used to explore jazz and popular music, either via analysis for educational purposes or for developing improvisational ideas.

Jazz Standard Voice Leading Lines

Music21 has an amazing corpus of public-domain classical music, but most jazz standards are not available for inclusion. Since music21 has an understanding of seventh chords and reads MusicXML, however, a virtual corpus of jazz standards is available for analysis and exploration via another application called IRealPro. IRealPro is a virtual-accompanist program that has chord charts for over 3,000 jazz standards and can export the chord progressions as MusicXML, a format that lets music21 understand the harmony. Once we have that outline of a jazz standard's harmonic structure, music21 can be turned loose.

For this example, let's export the chord chart for the standard “Alone Together” and generate a 3rd-to-7th voice-leading line through the entire tune, based on this concept by Burt Ligon, as described here.

(links: Alone Together.XML and Guide Tone Lines with Music21.py)

Since music21 understands harmony, any kind of voice-leading line is possible, for instance the 5th resolving to the 9th. These voice-leading lines can now be generated for any jazz standard (or any chord progression) that can be exported in MusicXML format, and they can be used as jumping-off points for building solos or studying voice leading.
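A rough sketch of the guide-tone idea (not the linked Guide Tone Lines script; the file path is hypothetical): walk the chord symbols of a parsed lead sheet and string their 3rds and 7ths into a line.

from music21 import converter, note, stream

leadSheet = converter.parse('/path/to/AloneTogether.xml')    # exported from IRealPro
guideTones = stream.Part()
for cs in leadSheet.recurse().getElementsByClass('ChordSymbol'):
  for p in (cs.third, cs.seventh):    # the guide tones of each chord
    if p is not None:
      n = note.Note(p)
      n.quarterLength = 2.0    # half note each, purely for display
      guideTones.append(n)
guideTones.show()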

Jazz Solo Analysis 

Analyzing jazz solos from the masters is another way to get improvisational material, though it is better known as stealing someone's licks! Since music21 can understand the relationship of any note to any chord, it can be used to analyze the functional relationship of the notes in a solo.

Here is an example of Miles Davis’s solo on “Freddie Freeloader” with the notes labeled to show their function against the chord being played; for example, an F note over a Bb7 chord is labeled as the fifth.
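A rough sketch of that kind of labeling (not the linked Melodic Labeler.py; the path is hypothetical): tag each melody note with its interval above the root of the chord symbol currently in force.

from music21 import converter, interval

melody = converter.parse('/path/to/FreddieFreeloaderSolo.xml').parts[0]
currentChord = None
for el in melody.recurse().notes:
  if 'ChordSymbol' in el.classes:
    currentChord = el    # remember the chord in force
  elif el.isNote and currentChord is not None:
    iv = interval.Interval(currentChord.root(), el.pitch)
    el.addLyric(iv.simpleName)    # e.g. 'P5' for an F over a Bb7 chord
melody.show()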

(links: Miles Solo XML  and Melodic Labeler.py)

This same Music21 code was used to analyze Charlie Parker's solo on Bloomdido, and a walking bass line over F blues by Ron Carter.

Now any solo line that can be exported as MusicXML can be analyzed by music21 and then explored even further. What notes are favored? What beats of the bar do certain notes get played on? How many times do certain notes get played? Are there repeating phrases that a certain player uses over and over? All of this can be cataloged or graphed once it has been brought into the music21 world. The included code needs a chord symbol over every measure.

Hopefully these examples show that music21 is not only for musicologists exploring the pitch class space of Bartok's string quartets or for twelve-tone row composers! Students and musicians can use it for very useful and practical purposes as well. Many thanks to Michael for allowing this guest post from a big music21 fan!

(Ed: Thanks Dan! The examples included here are copyrighted by their respective composers and publishers. We believe their inclusion here for educational and instructional purposes is supported by all four factors of the Fair Use test.)
[October 2, 2015 12:37 pm] [music21]

MIT Spectrum has an article by Kathryn M. O'Neill on my work, music21, and computational musicology:
“IF I WANT TO KNOW how the guitar and saxophone became the important instruments throughout classical repertory or how chord progressions have changed, those are questions musicology has been unable to approach,” says Associate Professor of Music Michael Cuthbert. Spotting trends and patterns in a large corpus of music is nearly impossible using traditional methods of study, because it requires the slow process of examining pieces one by one. What his field needed, Cuthbert determined, was a way to “listen faster.”
Read more at http://spectrum.mit.edu/articles/data-in-a-major-key/.

In other news, Clifton Callender at Florida State University is currently teaching a doctoral seminar on music theory techniques using music21.  His course description is at http://cliftoncallender.com/teaching/.



How can I contribute?

Music21 is a rapidly progressing project, and it is always looking for researchers interested in contributing code, questions, freely distributable pieces, bug fixes, or documentation. Please contact Michael Scott Cuthbert (cuthbert at mit.edu), Principal Investigator.

The development of music21 has been supported by the School of Humanities, Arts, and Social Sciences at M.I.T., the Music and Theater Arts section, and generous grants from the Seaver Institute and the NEH/Digging-Into-Data Challenge. Further donations to the project are always welcome.