Extract the musical key.
The key task recognizes the musical key and the tuning frequency of a song. The key describes the song's tonal center (the root note) and its scale, such as major or minor. Typical key descriptions are, for example, "A minor" or "E Major". The task distinguishes between major and minor keys and identifies the root pitch class, and thus has 24 possible results.
The musical key allows you to estimate the tonal compatibility between two or more songs, as is done, for example, in DJ software for so-called harmonic mixing. The closer two keys are on the Circle of Fifths, the more likely they are to be compatible with each other. Pitch shifting (see the process/elastique task) can be applied to the audio to increase compatibility.
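A rough sketch of this compatibility heuristic, assuming key names in the "root scale" format used by this task (e.g. "A min", "C Maj"); the distance measure itself is illustrative and not part of the API:

```python
# Sketch: harmonic-compatibility heuristic via the Circle of Fifths.
# The key-name format ("A min", "C Maj") and the distance measure are
# illustrative assumptions, not part of the sonicAPI response itself.
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def circle_position(root, scale):
    """Map a key to its position (0-11) on the Circle of Fifths.

    A minor key shares the position of its relative major
    (e.g. "A min" sits with "C Maj")."""
    pc = NOTES.index(root)
    if scale == "min":
        pc = (pc + 3) % 12          # relative major is 3 semitones up
    return (pc * 7) % 12            # one fifth = 7 semitones

def fifths_distance(key_a, key_b):
    """Number of Circle-of-Fifths steps between two keys (0-6).

    Smaller distances suggest better harmonic compatibility."""
    a = circle_position(*key_a.split())
    b = circle_position(*key_b.split())
    d = abs(a - b)
    return min(d, 12 - d)
```

For example, "A min" and "C Maj" (relative keys) have distance 0, while "C Maj" and "G Maj" (neighbors on the circle) have distance 1.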
- One of your personal access_ids.
- A file_id, a file URL, or a direct file upload (requires POST). To upload a local file, perform a POST request with the encoding type "multipart/form-data" (corresponding to an HTML form with an <input type="file" /> element).
- Skip audio data before this position.
- Ignore audio data after this position.
- [true, false], default: true. Whether to call the task in a blocking or non-blocking way. Blocking calls return the results directly. For more information, see the response documentation for this task.
- [xml, json, jsonp, xmlp], default: xml. The format to use in the response. Note that for the formats "jsonp" and "xmlp", the HTTP status code will always be 200.
- Callback function name; required when using the format "jsonp" or "xmlp".
- Arbitrary string to add to the logs. Allows you to add any string you might find helpful for later analysis of the request logs.
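Putting the parameters above together, a request URL might be constructed as follows. The endpoint path and the parameter names (access_id, input_file, blocking, format) are assumptions inferred from this documentation, not verified against the live API:

```python
# Sketch of building a key-analysis request URL. The endpoint path and
# parameter names below are assumptions inferred from the documentation.
from urllib.parse import urlencode

def build_key_request(access_id, input_file,
                      blocking=True, response_format="xml"):
    params = {
        "access_id": access_id,        # one of your personal access_ids
        "input_file": input_file,      # a file_id or a file URL
        "blocking": "true" if blocking else "false",
        "format": response_format,     # xml, json, jsonp, or xmlp
    }
    return "https://api.sonicapi.com/analyze/key?" + urlencode(params)
```

The returned URL can then be requested with any HTTP client; file uploads instead require a multipart POST as described above.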
The response gives you a status code, the file_id, and the corresponding download URL of the resulting file (audio for process tasks, XML for analyze tasks). For processing reports, use the /file/status request with the parameter format=xml/json/jsonp/xmlp.
|status||The status code of the task.|
|file_id||The unique identifier of the file.|
|href||The direct download link to the file, including the file_id.|
The response gives you the main results (musical key, tuning frequency); these are complemented by a list of pitch energies indicating which pitches are more and less common in the analyzed song.
|key||string||The name of the extracted key.
The algorithm detects both major (Maj) and minor (min) scales built on 12 different root notes (C, C#, D, D#, E, F, F#, G, G#, A, A#, B). There are no flat root notes, as we assume enharmonic equivalence (C sharp equals D flat).
|key_index||integer||A representation of the key as an index.
The key_index can take the values 0...23. The first 12 indices indicate the major keys and the remaining indices the minor keys, both starting from C (example: D# min is 15).
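The mapping can be sketched as a small decoding helper; the note order and the Maj/min labels follow the key attribute described above:

```python
# Sketch: decode a key_index (0-23) into a key name, following the layout
# described above (indices 0-11 = major keys from C, 12-23 = minor keys).
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def key_from_index(key_index):
    if not 0 <= key_index <= 23:
        raise ValueError("key_index must be in 0..23")
    scale = "Maj" if key_index < 12 else "min"
    return f"{NOTES[key_index % 12]} {scale}"
```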
|tuning_frequency||float||The song's tuning frequency in Hz.
The tuning frequency is the frequency of the concert pitch A4 and is usually 440 Hz. It may deviate by a few hertz for some songs.
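To judge how far a song is from standard tuning, the detected frequency can be expressed as a deviation in cents (100 cents = one semitone); this conversion is standard music math, not an API feature:

```python
# Sketch: express the detected tuning frequency as a deviation in cents
# from the standard 440 Hz concert pitch (100 cents = one semitone).
import math

def cents_from_440(tuning_hz):
    return 1200.0 * math.log2(tuning_hz / 440.0)
```

For example, a tuning frequency of 442 Hz corresponds to a deviation of roughly +8 cents.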
|pitch_energies||List of average pitch energies extracted from the music.|
Some pitch classes are used more frequently in a song than others. The pitch_energies attribute shows the (octave-independent) energy distribution (also referred to as pitch chroma).
The pitch_energies list contains the following attributes:
|name||string||The name of the pitch class.|
|energy||float||A value between 0 and 1 indicating the energy of the pitch class relative to all other pitch classes.|
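As an illustration, the pitch_energies list could be read out of an XML response like this; the element names in the sample are assumptions modelled on the fields documented above, not a verified response schema:

```python
# Sketch: extract the pitch_energies from an XML response.
# The element names in SAMPLE are assumptions modelled on the documented
# fields (name, energy), not a verified sonicAPI response schema.
import xml.etree.ElementTree as ET

SAMPLE = """<analyze_result>
  <key value="A min"/>
  <pitch_energies>
    <pitch name="A" energy="1.0"/>
    <pitch name="E" energy="0.82"/>
  </pitch_energies>
</analyze_result>"""

def read_pitch_energies(xml_text):
    """Return a {pitch-class name: energy} dict from the response XML."""
    root = ET.fromstring(xml_text)
    return {p.get("name"): float(p.get("energy"))
            for p in root.iter("pitch")}
```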
You can analyze the musical key of this song with a simple click on the Analyze button. The button is just a link with a specially constructed URL:
&input_file=http://www.sonicapi.com/music/brown_eyes_by_ueberschall.mp3 By requesting this URL, the input file is imported into the system, processed by sonicAPI, and the analysis result is displayed in your browser. The Live-Demo generates additional example code.