What is music21?
Music21 is a set of tools for helping scholars and other active listeners answer questions about music quickly and simply. If you’ve ever asked yourself a question like, “I wonder how often Bach does that” or “I wish I knew which band was the first to use these chords in this order,” or “I’ll bet we’d know more about Renaissance counterpoint (or Indian ragas or post-tonal pitch structures or the form of minuets) if I could write a program to automatically write more of them,” then music21 can help you with your work.
How simple is music21 to use?
Extremely. After starting Python and typing "from music21 import *" you can do all of these things with only a single line of music21 code:
Display a short melody in musical notation:
converter.parse("tinynotation: 3/4 c4 d8 f g16 a g f#").show()
Print the twelve-tone matrix for a tone row (in this case the opening of Schoenberg's Fourth String Quartet):
print(serial.rowToMatrix([2, 1, 9, 10, 5, 3, 4, 0, 8, 7, 6, 11]))
Convert a file from Humdrum's **kern data format to MusicXML for editing in Finale or Sibelius:
With five lines of music21 code or less, you can:
Prepare a thematic (incipit) catalog of every Bach chorale that is in 3/4:
for chorale in corpus.chorales.Iterator():
    firstTS = chorale.recurse().getElementsByClass('TimeSignature')[0]
    if firstTS.ratioString == '3/4':
        chorale.measures(0, 2).show()
Google every motet in your database that includes the word ‘exultavit’ in the superius (soprano) part (even if broken up as multiple syllables in the source file) to see how common the motet's text is (assuming you have a bunch of motets in "listOfMotets"):
for motet in listOfMotets:
    superius = motet.parts[0]
    lyrics = text.assembleLyrics(superius)
    if 'exultavit' in lyrics:
        webbrowser.open('http://www.google.com/search?q=' + lyrics)
Add the German name (i.e., B♭ = B, B = H, A♯ = Ais) under each note of a Bach chorale and show the new score:
bwv295 = corpus.parse('bach/bwv295')
for thisNote in bwv295.recurse().notes:
    thisNote.addLyric(thisNote.pitch.german)
bwv295.show()
Of course, you are never limited to just using five lines to do tasks with music21. In the demos folder of the music21 package and in the sample problems page (and throughout the documentation) you’ll find examples of more complicated problems that music21 is well-suited to solving, such as cataloging the rhythms of a piece from most to least frequently used.
Music21 builds on preexisting frameworks and technologies such as Humdrum, MusicXML, MuseData, MIDI, and Lilypond, but adds an object-oriented skeleton that makes it easier to handle complex data. At the same time, music21 tries to keep its code clear and to make reusing existing code simple. With music21, once you (or anyone else) have written a program to solve a problem, that program can easily become a module to be adapted or built upon to solve dozens of similar (but not identical) problems.
Interested in learning more?
- Get Started with music21
- Browse the music21 documentation
- Download music21 from GitHub
- Get our latest news and updates at the music21 blog
- Read the Frequently Asked Questions list
- Sign up for the music21list mailing list through Google Groups.
Download from https://github.com/cuthbertLab/music21/releases or from the terminal, type:
pip3 install --upgrade music21
(or without the "3" if you are using Python 2)
Version 4 is the last version of music21 that will support Python 2.7. If you run Version 4 on Python 2.7, you will see a warning that it's time to move up to the brilliance that is Python 3.6.
Upgrades for most users should work automatically by typing:
pip3 install --upgrade music21
or (Python 2)
pip install --upgrade music21
Or download from GitHub
A summary of the most important changes since v.2.2 is below:
Fun and easy to understand changes:
Lots more MusicXML support -- better support for time signatures, for grace notes, for spanners, for metadata, for...you name it! And the system is refactored in such a way as to make contributing missing features quite easy.
MIDI files play back in Jupyter/IPython contexts. Lots of improvements there for people who use MuseScore.
The User's Guide has become much more awesome, and you can play with all the good features there.
Many new ways to search scores -- LyricSearcher is fully polished and documented in the User's Guide. Carl Lian's search.serial is upgraded from alpha to a full release that will be easy to expand in the future -- want to know where certain motives are used or transformed? This will make it easy to do. And search.segment -- the old standby -- is even better than ever (see below)
Try splitting notes and recombining them -- lots of intelligence going into this.
There's a big difference between taking a passage up a major third and taking it up 4 semitones. In chromatic contexts, music21 will now spell things how a musician would like to see them. If you're working with MIDI data, without explicit enharmonics specified, you'll appreciate this.
Go ahead and use '~/../dir' and things like that in your file parsing -- music21 groks all that.
Look at you, fancy programmer, with your 4-CPU laptop! Why not give common.runParallel(tasks, function) a try and get music21 working 2-3x faster than before? (just make sure that "tasks" isn't a list of Streams). Oh, and if you're using search.segment to search for a particular passage inside a large collection of files, don't worry about using runParallel -- we'll do that for you automatically.
Docs are pretty and much better. The User's Guide is the place to start.
Too many other changes to mention, but some shoutouts: to Shimpe for new Lilypond code (triplet chords, etc.), Frank Zalkow for enharmonic spelling, dynamics, tempos, and other things, Chris Antilla for continued dedication to MEI processing, Emily Zhang for hashing functions and speeding up MIDI quantization, Sonovice for articulation handling, Dr. Schmidt for chord-symbol translations, Bagratte for cross-Py2/3 IO fixes, and Bo-Cheng Jhan for great braille contributions.
Big under the hood changes.
Big, backwards-incompatible change: many calls such as .parts, .notes, .getElementsByClass(), .getElementsByOffset(), etc. no longer return Streams. They now return iterators (something called a StreamIterator). For most uses, this is not going to change anything: you can still write for n in myStream.notes: and it'll work great. It makes many parts of music21 much, much faster. For small scores, the differences will be small; for large scores, the differences will be tremendous, especially when filters are chained, such as: myScore.recurse().notes.getElementsByClass('Chord').getElementsByOffset(0.0). You're going to find that writing iterator chains is an amazing way to get at only the items you want, especially with custom filters. To get the old behavior, just add .stream() to the end of your iteration.
Because none of these filters change the activeSite of an element, you'll find that this is much more stable than before.
If you want to know what the note after a given note is in a musical context, call n.next() or n.previous(). If it's the last note of a measure, it'll move on to the first note of the next. And once you've called .next() on one note of a stream, the remaining calls will be super, super fast. I still haven't wrapped my mind completely around this paradigm, but it sure beats all the fooling around I used to do to figure out if one note was the same pitch as the next one, etc.
If you've been using music21 for some time, but have never looked at the docs for base.contextSites(), do it -- this is a very fast and extremely powerful way of figuring out how two objects relate to each other. Together with the .next(), .previous(), and better support for .derivation, many extremely powerful systems can be written in music21 easily that could only be written with huge difficulty before.
Nearly all functions marked deprecated in v.2 have been removed. Lots of super obscure functions in base.Music21Object or sites.Sites are gone. This is a positive step, since it'll make the documentation for these objects simple enough to understand.
Sorting works. I mean, it just works. With grace notes, with oddly positioned elements, with what have you. And it's pretty darn fast. This might seem like something small, but it's enormous for us.
Corpus managing is much simplified -- if you ever thought in the past, "Hey, I'd like to use a custom corpus" and then thought, "uhh...no thanks..." give a look at what is needed to set one up now. You'll be glad for it!
MuseScore, not Lilypond, is now used in Jupyter/IPython notebooks.
Complex durations are a lot less complex -- and faster.
PyLint on all code -- I estimate that at least 200 undetected bugs vanished through this major effort.
As noted in messages to the music21 mailing list (music21list at Google Groups), v.3.1 is the first non-beta release in the v.3 lineup. Version 3 happens to share a version number with Python 3, but that is merely coincidence. Music21 version 3 continues to work with Python 2.7 as well as Python 3.4. Version 3 adds explicit support for Python 3.5 and drops support for Python 3.3. Music21 will continue to develop into a Version 4 to be released next summer (4.0.x will be alpha and beta releases and 4.1.0 will be the public release). Version 4 will likely be the last version to support Python 2.
This release represents the end of a year's sabbatical where I got to work on low-level music21 functions that I didn't think anyone else would want to. Due to teaching and other obligations, I'll be taking off from work on the heart of music21 until the holidays (I'll still be taking bug fixes, etc.) and working more on documentation, examples, and applications. The changes put in place for music21 v.3 have made working with it a lot more fun for me, so you'll probably see a lot more applications get added, first to the alpha directory and then into the main set. I've also put up a version 4 roadmap (trees are almost done; they should make it in. Style objects will be introduced so that beautiful musical scores can be created, or at least imported and exported properly, without major speed losses), so if anyone wants to take the lead on a project you can do so. I'm working on a project called STAMR, Small Tools for Agile Music Research, which should create standalone tools using music21 and music21j for musicologists to get their work done faster. Given that I'm teaching music fundamentals online again, you should see music21 and music21j integration working far better than ever before.
Thanks to MIT for supporting my work, and the Seaver Institute and the NEH for initial funding to make music21 a success. And thanks to this great community for all your contributions in the past and contributions to come.
Get it at:
This is also likely to be the Release Candidate for music21 v.3. I've indicated several times in the past that music21 v.3 is the last release that I guarantee will continue to support Python 2. I now suspect that there will be a Python 2.7-compatible v.4, since Python 2.7 is still the shipping version of Python with macOS Sierra.
Notes about the substantial changes in Version 3 have been posted on this list several times. The biggest changes are that .getElementsByClass() and .notes, etc. all return a new class called a “StreamIterator” which makes working with stream filtering much much faster than before, especially for very large streams.
The new tree-based storage system is still too flaky to turn on by default except in a few cases, and will be deferred to Music21 v.4.
Laundry list of items changed since 3.03-alpha:
min Python 3 version is now 3.4 -- 3.3 should still work but is untested.
Notes parsed from MIDI or transposed by a number of semitones now get a .spellingIsInferred attribute which indicates that they can change their spelling ("G#" vs "A-") as needed for the situation. This is incompatible behavior, but much improved!
More User's Guide chapters...
Lots of improvements to Braille output, and refactoring to make more improvements in the future possible. (Thanks Bo-Cheng Jhan! and esp. to Jose Cabal-Ugaz in the first place).
Better documentation for Searching Lyrics (User's Guide, Chapter 28)
MIDI will now handle part names, etc. that use unicode.
Long files (>10 pages) now work with MuseScore's automatic PDF generation (thanks Emily Zhang)
TimeSignatures such as 2/4+3/8 display much better now.
TimeSignatures import and export symbols ('common', 'cut', etc.) in MusicXML properly.
Fixes for MusicXML parsing where voices and chords interact.
configure.py now finds common MusicXML readers (Finale, Sibelius, MuseScore) on Windows!
insertIntoNoteOrChord works in more cases
Streams are faster to calculate their own durations
Major speedups to large makeMeasures() calls.
A failure in makeTies now gives a warning instead of an exception
The old "musicxmlOld" format is removed.
metronomeMarkBoundaries works better (thanks Frank Zalkow!)
improved ability to specify how to quantize MIDI (thanks EZ!)
improvements to automatic instrument detection.
lots more MEI articulations work (thanks sonovice)
Fingerings import and export to MusicXML better.
noteheadParentheses work well now.
Roman.fromChordAndKey gets quite a few more chords now.
incompatible change: normalOrder instead of normalForm for chords -- gives the untransposed ordering of a chord -- this is the correct behavior; prior behavior is wrong.
incompatible change: Stream.offsetMap is now a method
incompatible change: common.isStr() removed -- use isinstance(s, str) (or isinstance(s, (str, unicode)) on Python 2)
incompatible change: key.KeySignature() no longer supports a .mode attribute. Use key.Key() instead. key.KeySignature() objects get an .asKey('major') etc. method.
incompatible change: pitches lose ".getMidiPreCentShift()" method. Just call .midi instead!
in case you missed what is new in v.3.0.3:
Last year I learned about music21 and ever since I have been wondering how I can use it to learn more about the Moroccan musical repertoires that I study. Long story short, I ended up building a tool for creating interactive web-based contour visualizations from the command line and I'd like to share it here.
Climbing out of a rabbit hole
|Malhun Performance in Fez, Morocco|
|Malhun Contour Example|
Building a tool
|ContourViz, simple example|
|ContourViz, more complex example|
I am a gigging musician and bass player who has discovered music21, but, alas, I am certainly not a musicologist or academic.
I have seen many of the amazing examples that showcase music21’s capabilities with classical and twentieth-century music, and wanted to show how I use music21. Hopefully these examples show that music21 can also be used to explore jazz and popular music, either via analysis for educational purposes or for developing improvisational ideas.
Jazz Standard Voice Leading Lines
Music21 has an amazing corpus of public domain classical music, but most jazz standards are not available for inclusion. But, since music21 has an understanding of seventh chords and reads MusicXML, a virtual corpus of jazz standards is available for analysis and exploration via another application called IRealPro. IRealPro is a virtual accompanist software program that has chord charts for over 3000 jazz standards, and which can export the chord progressions in MusicXML, a format that will allow music21 to understand the harmony. Once we have that outline of a jazz standard's harmonic structure, music21 can be turned loose.
For this example, let's export the chord chart for the standard “Alone Together” and generate a 3rd-to-7th voice-leading line through the entire tune, based on this concept by Bert Ligon, as described here.
(Attachments: Alone Together.XML and Guide Tone Lines with Music21.py)
Since music21 understands harmony, any kind of voice leading line is possible, for instance the 5th resolving to the 9th. Now these voice leading lines can be generated for any jazz standard (or for any chord progression) that can be exported as MusicXML format and these lines can be used as jumping off points for making solos or studying voice leading.
Jazz Solo Analysis
Analyzing jazz solos from the masters is another way to get improvisational material, but it is better known as stealing someone's licks! Since music21 can understand the relationship of any note to any chord, it can be used to analyze the functional relationship of the notes in a solo.
Here is an example of Miles Davis’s solo on “Freddie Freeloader” with the notes being labeled so they represent their function against the chord being played, for example, an F note on a Bb7 chord being the fifth.
(Attachments: Miles Solo XML and Melodic Labeler.py)
This same Music21 code was used to analyze Charlie Parker's solo on Bloomdido, and a walking bass line over F blues by Ron Carter.
Now any solo line that can be exported as MusicXML can be analyzed by music21 and then explored even further. What notes are favored? What beats of the bar do certain notes get played on? How many times do certain notes get played? Are there repeating phrases that a certain player uses over and over? All of this can be cataloged or graphed once it has been brought into the music21 world. The included code needs a chord symbol over every measure.
Hopefully these examples show that music21 is not only for musicologists exploring the pitch class space of Bartok's string quartets or for twelve-tone row composers! Students and musicians can use it for very useful and practical purposes as well. Many thanks to Michael for allowing this guest posting from big music21 fan!
(Ed: Thanks Dan! The examples included here are copyrighted by their respective composers and publishers. We believe their inclusion here for educational and instructional purposes is supported by all four factors of the Fair Use test.)
MIT Spectrum has an article by Kathryn M. O'Neill on my work, music21, and computational musicology:
“IF I WANT TO KNOW how the guitar and saxophone became the important instruments throughout classical repertory or how chord progressions have changed, those are questions musicology has been unable to approach,” says Associate Professor of Music Michael Cuthbert. Spotting trends and patterns in a large corpus of music is nearly impossible using traditional methods of study, because it requires the slow process of examining pieces one by one. What his field needed, Cuthbert determined, was a way to “listen faster.”
Read more at http://spectrum.mit.edu/articles/data-in-a-major-key/.
How can I contribute?
Music21 is a rapidly-progressing project, but it is always looking for researchers interested in contributing code, questions, freely-distributable pieces, bug fixes, or documentation. Please contact Michael Scott Cuthbert (cuthbert at mit.edu), Principal Investigator.
The development of music21 has been supported by the School of Humanities, Arts, and Social Sciences at M.I.T., the Music and Theater Arts section, and generous grants from the Seaver Institute and the NEH/Digging-Into-Data Challenge. Further donations to the project are always welcome.