
US20110252947A1 - Apparatus and method for classifying, displaying and selecting music files - Google Patents

Apparatus and method for classifying, displaying and selecting music files

Info

Publication number
US20110252947A1
US20110252947A1 (application US13/079,362, US201113079362A)
Authority
US
United States
Prior art keywords
music
colour
attributes
vector
electronic device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US13/079,362
Other versions
US8686270B2 (en
Inventor
Jana Eggink
Franck Giron
Thomas Kemp
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Assigned to SONY CORPORATION reassignment SONY CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: EGGINK, JANA, GIRON, FRANCK, KEMP, THOMAS
Publication of US20110252947A1 publication Critical patent/US20110252947A1/en
Application granted granted Critical
Publication of US8686270B2 publication Critical patent/US8686270B2/en
Active legal-status Critical Current
Adjusted expiration legal-status Critical

Classifications

    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10HELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H1/00Details of electrophonic musical instruments
    • G10H1/0008Associated control or indicating means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/60Information retrieval; Database structures therefor; File system structures therefor of audio data
    • G06F16/68Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/60Information retrieval; Database structures therefor; File system structures therefor of audio data
    • G06F16/68Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/683Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10HELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2240/00Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
    • G10H2240/075Musical metadata derived from musical analysis or for use in electrophonic musical instruments
    • G10H2240/085Mood, i.e. generation, detection or selection of a particular emotional content or atmosphere in a musical piece
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10HELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2240/00Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
    • G10H2240/121Musical libraries, i.e. musical databases indexed by musical parameters, wavetables, indexing schemes using musical parameters, musical rule bases or knowledge bases, e.g. for automatic composing methods
    • G10H2240/131Library retrieval, i.e. searching a database or selecting a specific musical piece, segment, pattern, rule or parameter set
    • G10H2240/135Library retrieval index, i.e. using an indexing scheme to efficiently retrieve a music piece

Definitions

  • Embodiments of the invention relate to a method for classifying music files for identifying similar pieces of music and to methods for displaying and selecting classified music files.
  • Other embodiments refer to consumer electronic devices capable of classifying music files and to consumer electronic devices for displaying or selecting classified music files.
  • Consumer electronic devices with high storage capacity typically include means that assist the user in managing the music files, for example by automatically generating playlists.
  • the playlists may be generated in accordance with a presetting related to genre, mood, interpreter or others.
  • the object underlying the invention is to provide a method that assists the user in classifying music files, wherein the results of the classification are in closer accordance with the user's notion of similarity in perception.
  • Another object is a graphical interface for displaying classified music files and allowing a music file selection mechanism whose results meet the user's expectations to a higher degree.
  • a further object is to provide consumer electronic devices allowing enhanced music file classification, display and selection.
  • FIG. 1 is a simplified block diagram illustrating a consumer electronic device in accordance with an embodiment referring to a calculator unit for calculating a mood related vector.
  • FIG. 2 is a simplified block diagram showing a consumer electronic device in accordance with another embodiment referring to handheld consumer electronic devices for display and selection of music files.
  • FIG. 3A is a schematic illustration of a display displaying a list for defining a first mood related vector in accordance with an embodiment referring to a method for classifying music files.
  • FIG. 3B is a schematic illustration of display domains assigned to music attributes in accordance with embodiments referring to methods for displaying and selecting music files.
  • FIG. 3C is a schematic illustration of a display displaying objects assigned to music files in accordance with the embodiment of FIG. 3B .
  • FIG. 4 is a simplified diagram of a colour plane for illustrating details of colour allocation to mood related vectors in accordance with the embodiment of FIG. 3C .
  • FIG. 5 is a simplified diagram of a colour plane for illustrating details of a selection mechanism in accordance with another embodiment referring to a method of selecting music files on the basis of the embodiment of FIG. 3B .
  • FIG. 6 is a simplified flow chart for illustrating a method of classifying music files in accordance with a further embodiment.
  • FIG. 1 refers to an apparatus 200 which is an electronic device such as a consumer electronic device.
  • the apparatus 200 may be a handheld device with a music playback function, for example a portable music player, a cellular phone, a digital personal assistant, or a stationary device, for example a home entertainment computer or an audio tuner with music record function.
  • the consumer electronic device may have an input port 202 for receiving a music file Dat or a storage unit 230 for locally storing music files or both.
  • the consumer electronic device may include an extraction unit 210 for determining parameters descriptive of the music contained in a music file Dat provided by the storage unit 230 or via the input port 202 .
  • the parameters may concern perception-related features like tempo, rhythm, dynamic range, instrumentation, beats per minute, time domain aspects, frequency domain aspects, and cepstral aspects, as well as non-perception-related features like interpreter, year of release and language.
  • the consumer electronic device may receive the parameters assigned to a music file from another device. For example, the consumer electronic device may receive the parameters together with the respective music file from another consumer electronic device or a server.
  • the consumer electronic device may also have a further input port 204 for receiving information on selected music attributes (“channels”, “mood models”) Attr descriptive of a perceptual notion delivered by music represented by the music content of a music file.
  • the consumer electronic device may include a user entry unit 220 , with which a user may select at least two different music attributes.
  • a music attribute Attr may be represented by or may be derived from one single music parameter or music feature, for example tempo, dynamic range, mean loudness, or may combine two or more parameters in a way that they provide perceptual-oriented information about the music content of the respective music file, for example information related to genres, to which the music can be assigned, or information related to moods descriptive of the perceptual notion of the music.
  • the music attributes Attr may be derived from the complete music file contents or from a relevant part thereof.
  • some combinations of parameters or parameter ranges may be typical for a perception of a music content as more or less "extreme", other combinations and parameter ranges may be typical for a perception as "relaxed", "energetic", or "upbeat", or for a perception as belonging to a cross-genre class like "ballad", "electronic", "acoustic", "lounge", "classical" or "music".
  • Other music attributes may correspond to genres like “pop”, “rock”, “classic” or “jazz”.
  • the user entry unit 220 allows the selection of two or three different music attributes.
  • a calculator unit 110 calculates a first vector m 1 (Dat) that is descriptive of a degree of agreement between the music content of the target music file Dat and the selected music attributes Attr.
  • the calculator unit 110 outputs classification information containing at least the first vector m 1 (Dat).
  • the classification information also contains additional information m 2 (Dat) descriptive of a degree of agreement between the music contents of the target music file Dat and another proper or improper subset of the music attributes.
  • the second vector m 2 (Dat) represents the best fitting music attribute and defines the affinity to a domain in a plane in which the music attributes are grouped.
  • the additional information m 2 (Dat) is a second, different mood-related vector determined by the calculator unit 110 and descriptive of the music represented by the music content of the same music file Dat.
  • the second vector m 2 (Dat) may define a position in a mood plane based on two different parameters or music attributes, for example one single parameter and one music attribute based on the evaluation of at least two parameters.
  • the two-dimensional mood plane may define a Cartesian or rotational coordinate system, by way of example.
  • the user may select a mood/tempo-space, wherein the position of a music file is defined by a first parameter describing a value on a sad-to-happy axis and a second, different parameter describing a value on a beat-per-minute axis.
  • the classification information may be displayed or may be transmitted to a further consumer electronic device. According to other embodiments, the classification information may be assigned to the music file Dat and stored as an attached attribute together with the respective music file Dat within the consumer electronic device, for example in the storage unit 230 . The classification information may be used for automatically generating playlists or for finding music files perceptually similar to an identified piece of music, for example the music file most recently selected or played back at the consumer electronic device.
  • FIG. 2 refers to an apparatus 200 where the classification information is used for assisting the user in selecting music files from among a plurality of music files available in a database.
  • the apparatus 200 is a handheld consumer electronic device with a display 250 and a user entry unit 220 .
  • a control unit 150 of the consumer electronic device controls the transfer of music files stored in a storage unit 230 to an output unit 290 .
  • the control unit 150 may be or may include a processor.
  • the user entry unit 220 may include a manual input device with buttons and/or sensors or may be integrated in or combined with the display 250 , wherein the display 250 and the user entry unit 220 form a touchscreen capable of detecting and sensing the presence and location of a touch within a display area of the display 250 .
  • the user entry unit 220 may include a receiver unit for receiving electric signals from other electronic devices, for example an IR (infrared) receiver or a USB (universal serial bus) port.
  • the output unit 290 may be or include a loudspeaker, a headphone jack, an audio output port or a data output port configured to transmit music files.
  • the control unit 150 may transmit information to be displayed on the display 250 to a display control unit 280 that may be a graphical processor for controlling the display 250 to display the information provided by the control unit 150 .
  • the consumer electronic device may allow the user to select two or three channels for defining a first vector.
  • the user may operate the user entry unit 220 to request the control unit 150 to display a suitable selection menu on the display 250 .
  • the control unit 150 may cause the display control unit 280 to display a list of music attributes on the display 250 .
  • FIG. 3A shows a list 300 displayed on a display 250 and containing several entries 301 , wherein each entry 301 represents one of a plurality of predefined music attributes.
  • the list 300 contains music attributes referring to genres and music attributes referring to perceptually different cross-genre music classes like "extreme", "energetic", "music", "upbeat", "podcast", "ballad", "electronic", "acoustic", "relax", "lounge", "classical", or others.
  • Each music attribute may be a combination of parameter values of certain music features.
  • the assignment of measurable parameter values to music attribute values may be based on the evaluation of the ratings of test music files by test persons and combining the evaluation results with the measurable parameters of the test music files.
  • the user may manipulate the user entry unit 220 to select a predetermined number of music attributes from the list 300 .
  • the predetermined number of selectable music attributes is two or three.
  • the user may select all two or three music attributes.
  • the user selects only one or two music attributes and the control unit 150 automatically chooses one or two further suitable music attributes.
  • the user selects one or two music attributes and the control unit 150 automatically chooses a suitable second or a suitable third music attribute such that for the user selected music attributes a neutral class can be identified, with reference to which at least two of the selected music attributes can be considered antithetic.
  • the user may assign a graphic attribute to each selected music attribute.
  • the graphic attribute may be the colour or the contour of an object assigned to the music file.
  • the control unit 150 automatically assigns the graphic attribute, for example a fundamental colour, to each selected music attribute.
  • the control unit 150 may output classification information containing a first vector that is derived from the values of the selected music attributes.
  • the control unit 150 may also determine additional information, for example a best fitting music attribute or a second, different vector descriptive of the perceptual notion of the music file by the user.
  • the control unit 150 may enclose the additional information, for example the best fitting music attribute or the second vector, in the output classification information.
  • the best fitting music attribute may be used to determine the position of an object in a mood plane, where the music attributes are assigned to different domains grouped in accordance with the perceptual proximity of the respective music attributes. For example, on a display 250 a first domain 351 may be assigned to the music attribute "classical", a second domain 352 to the attribute "lounge", a third domain 353 to the attribute "relax", a fourth domain 354 to the attribute "ballad", a fifth domain 355 to the attribute "acoustic", a sixth domain 356 to the attribute "extreme", a seventh domain 357 to the attribute "electronic", an eighth domain 358 to the attribute "energetic", and a ninth domain 359 to the attribute "upbeat". Objects assigned to music files having "relax" as the best fitting music attribute are displayed in or next to the third domain 353 , and objects assigned to music files with "classical" as the best fitting music attribute are displayed in or near the first domain 351 .
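The grouping of domains 351-359 described above can be sketched as a simple lookup. This is only an illustrative sketch: the exact geometry, the 3-by-3 arrangement, and the `domain_cell` helper are assumptions, since the embodiment only requires that perceptually close attributes occupy neighbouring domains.

```python
# Illustrative 3x3 layout of the nine display domains 351-359.
# The ordering below is an assumption for demonstration purposes.
DOMAINS = [
    "classical",  "lounge",    "relax",     # domains 351, 352, 353
    "ballad",     "acoustic",  "extreme",   # domains 354, 355, 356
    "electronic", "energetic", "upbeat",    # domains 357, 358, 359
]

def domain_cell(best_fitting_attribute):
    """Return the (row, column) grid cell of the display domain in
    which objects with this best fitting attribute are drawn."""
    index = DOMAINS.index(best_fitting_attribute)
    return divmod(index, 3)
```

Objects whose best fitting attribute is "relax" would then be drawn in or next to the cell returned by `domain_cell("relax")`.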
  • the consumer electronic device may assist the user in defining the second vector.
  • the user may operate the user entry unit 220 to request the control unit 150 to display a suitable selection menu for music features and attributes.
  • the selection menu may be a second list containing several entries, wherein each entry may represent a parameter like beats per minute or dynamic range, or a music attribute, for example a sad-happy index or a perceptually different cross-genre music class like "extreme", "energetic", "music", "upbeat", "podcast", "ballad", "electronic", "acoustic", "relax", "lounge", "classical", or others.
  • the selected entries for the second vector may define a mood plane on the display.
  • the mood plane may be a Cartesian plane, where the axes denote parameters or metadata dimensions such as happy/sad or slow/fast, wherein a sad/happy parameter may be mapped onto the x-axis and a slow/fast parameter may be mapped on the y-axis.
  • the classification information m 1 (Dat), m 2 (Dat) may be used to control the visual representation of objects assigned to the music files on the display 250 .
  • the objects may be crosses, circles, points, squares or characters.
  • the first vector m 1 (Dat) may determine an appearance of at least a section of the object and the additional information or the second vector may determine the position of the object on the display unit 250 .
  • the control unit 150 may map the second vector m 2 (Dat) into a planar, orthogonal Cartesian system with each of the two vector dimensions assigned to one of two orthogonal display axes.
  • the beats-per-minute axis of the mood-space may be mapped onto a y-axis of the display and the sad-to-happy axis may be mapped onto an x-axis, such that slow titles suggestive of sadness appear in the lower left quarter and fast titles suggestive of happiness appear in the upper right quarter.
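The axis mapping described above can be sketched as follows. The [-1, 1] scale for the sad-to-happy parameter and the linear mapping into display pixels are assumptions for illustration; the embodiment does not fix these scales.

```python
# Sketch of the axis mapping: sad-to-happy on x, beats per minute on y,
# so that slow, sad titles land in the lower left quarter and fast,
# happy titles in the upper right quarter (y grows upward here).
def to_display_position(sad_happy, bpm, bpm_min, bpm_max, width, height):
    """Map a second vector onto display coordinates.
    sad_happy is assumed normalised to [-1, 1]; bpm is clipped to the
    database's tempo range [bpm_min, bpm_max] by the caller."""
    x = int((sad_happy + 1.0) / 2.0 * (width - 1))
    tempo = (bpm - bpm_min) / (bpm_max - bpm_min)
    y = int(tempo * (height - 1))
    return x, y
```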
  • the first vector m 1 (Dat) may determine the contour of the respective object.
  • the object contains colour information about at least the best fitting music attribute of the selected music attributes.
  • the object may have the colour of that music attribute that fits best with the first vector of the respective music files when a discrepancy between the first vector m 1 (Dat) and the music attribute does not exceed a predetermined threshold value.
  • An object may contain colour information about two close music attributes when a discrepancy between the first vector m 1 (Dat) and the best fitting music attribute exceeds the predetermined threshold.
  • the object may include two or more sub-areas, each sub-area having a colour assigned to one of the music attributes, wherein an area ratio of the sub-areas may correspond to an agreement ratio of the first vector m 1 (Dat) with the respective music attribute.
  • the colour of the object or at least a sub-area of the object is a combination colour that is mixed from the two fundamental colours assigned to the two closest music attributes when a discrepancy between the first vector and the closest music attribute exceeds the predetermined threshold.
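One way to read the colour rules above is the following sketch. The RGB values, the discrepancy measure, and the threshold handling are illustrative assumptions, not values taken from the embodiment.

```python
# Fundamental colours for the three selected attributes (assumed RGB).
FUNDAMENTAL = {
    "extreme": (255, 0, 0),   # red
    "ballad":  (0, 255, 0),   # green
    "relax":   (0, 0, 255),   # blue
}

def object_colour(agreement, threshold):
    """agreement: degree of agreement of the first vector with each
    selected music attribute, assumed in [0, 1]. Below the threshold
    the object gets the single best fitting colour; above it, a mix
    of the two closest fundamental colours weighted by agreement."""
    ranked = sorted(agreement, key=agreement.get, reverse=True)
    best, second = ranked[0], ranked[1]
    discrepancy = 1.0 - agreement[best]   # illustrative measure
    if discrepancy <= threshold:
        return FUNDAMENTAL[best]
    w = agreement[best] / (agreement[best] + agreement[second])
    c1, c2 = FUNDAMENTAL[best], FUNDAMENTAL[second]
    return tuple(round(w * a + (1 - w) * b) for a, b in zip(c1, c2))
```

The same weighting ratio could equally drive the area ratio of sub-areas instead of mixing the colours, matching the alternative embodiment above.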
  • FIG. 3C refers to an embodiment where the positions of the objects result from the segmentation of a display 250 into domains 351 - 359 as described with reference to FIG. 3B and from the best fitting music attribute for the displayed music files.
  • the selected music attributes are “extreme” assigned to the fundamental colour red, “ballad” assigned to the fundamental colour green, and “relax” assigned to the fundamental colour blue.
  • the frequency of red objects is high near the sixth domain 356 assigned to the attribute “extreme”
  • the frequency of green objects is high near the fourth domain 354 assigned to the attribute “ballad”
  • the frequency of blue objects is high near the third domain 353 assigned to the attribute “relax”.
  • the method allows identification of music files that, though placed close to a first domain, show stronger perceptual proximity to other music files placed close to other domains.
  • the object 399 identifies a music file that, despite its proximity to the sixth domain 356 representing extreme music contents, the user perceives as similar to the music files represented by objects 391 , 392 , which are positioned close to other domains but which have a similar colour. Similar colours representing similar first vectors identify perceptually similar music content in a cross-genre manner.
  • the consumer electronic device may analyze each music file stored in the storage unit 230 to generate a score table for each music file.
  • the control unit 150 may analyze a music file in response to a user command or automatically, for example when the music file is stored in the storage unit 230 for the first time.
  • the consumer electronic device may receive the score table together with the music file or the score table may already be embedded in the music file.
  • the score table is determined on the basis of objective, measurable parameters descriptive of the music content of the concerned music file. It assigns a score value to each music attribute selected for the first vector, wherein the score value is a measure for the degree of agreement of the music contained in the music file with a subjective perception quality represented by the music attribute.
  • Table 1 shows an example for a score table assigned to a music file X:
  • x i = [score(X, channel(i)) − score(X, channel("neutral class"))] / score(X, channel("neutral class"))   (1)
  • the music attribute “music” is selected as neutral class.
  • channel( 0 ) extreme
  • channel( 1 ) ballad
  • channel( 2 ) relax
  • Applying equation (1) to the channel scores channel(i) of Table 1, equation (2) gives the relative score vector x i :
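As a sketch, the relative score of equation (1) can be computed per channel. The absolute score values below are made up for illustration; they are not the values of Table 1.

```python
# Relative channel score per equation (1): the difference to the
# neutral class score, normalised by the neutral class score.
def relative_scores(scores, neutral="music"):
    base = scores[neutral]
    return {channel: (value - base) / base
            for channel, value in scores.items() if channel != neutral}

# Illustrative absolute scores for a music file X (assumed values).
example = {"extreme": 0.9, "ballad": 0.6, "relax": 0.45, "music": 0.5}
rel = relative_scores(example)
best_channel = max(rel, key=rel.get)  # highest relative score wins
```

With these assumed values "extreme" ranks first, followed by "ballad" and "relax" in decreasing order.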
  • the best fitting channel for the music file X is “extreme”.
  • the channels “ballad” and “relax” follow in decreasing order.
  • the relative score values may be used for determining graphic properties of a displayed object assigned to the music file X.
  • the relative scores may be normalized with respect to the music files available in the database of interest, for example the whole or a selected part of that database, to which the music file X belongs.
  • the relative scores may be normalized with respect to the widest range of all reference relative channel scores available in the database using equation (3):
  • the database of interest may deliver the following values for max{x i } and min{x i }:
  • the first two values may then be used as vector norms in a two-dimensional colour plane.
  • the preceding normalization step allows utilizing the complete range of values for these channels which also results in a larger variation of the norms.
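Since equation (3) is not reproduced here, the following min-max scaling over the database-wide range of each channel's relative scores is only an assumed reading of the normalisation step.

```python
# Assumed min-max normalisation: scale each relative channel score by
# the widest range of reference relative scores available in the
# database of interest, so the complete range of values is utilised.
def normalise(relative, db_min, db_max):
    return {channel: (value - db_min[channel]) /
                     (db_max[channel] - db_min[channel])
            for channel, value in relative.items()}
```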
  • FIG. 4 illustrates details of colour assignment in accordance with an embodiment referring to the selected music attributes “extreme”, “relax”, and “ballad”.
  • three music attributes are assigned to three different colours and three different directions related to a point of origin 410 .
  • a first music attribute 401 is assigned to a first colour 421 and a first direction 441
  • a second music attribute 402 is assigned to a second colour 422 and a second direction 442 different from the first direction 441
  • a third music attribute 403 is assigned to a third colour 423 and a third direction 443 different from the first and the second direction.
  • the second direction 442 may be rotated by 120 degrees versus the first direction 441 and the third direction 443 may be rotated by −120 degrees versus the first direction 441 .
  • the first, second and third colours may be the fundamental colours of the RGB (red, green, blue) colour system.
  • the three directions 441 , 442 , 443 span a hue system, wherein the direction of the first vector referred to the hue system defines the hue of an object assigned to the first vector.
  • the music attribute “extreme” is assigned to a hue of 0 degree (red)
  • the music attribute “ballad” is assigned to a hue of 120 degrees (green)
  • the music attribute “relax” is assigned to a hue of 240 degrees (blue).
  • the vector y i with the basic vector y 0 plotted along the 0 degree direction and with the basic vector y 1 plotted along the 120 degree direction gives a hue φ of about 90 degrees.
  • the music file X is assigned to a combination of green and yellow.
  • Objects of music files that the user perceives as similar to the music file X are assigned to similar hues. Perceptual divergences are transformed into colour variations. When objects assigned to music files are displayed in colours corresponding to the evaluated hues, the user can easily determine the perceptual distance on the basis of the degree of colour deviation.
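The hue evaluation can be sketched as a vector sum in the colour plane of FIG. 4. The direction table matches the hues given above; treating the components as plain Cartesian contributions along those directions is an assumption.

```python
import math

# Directions of the three selected attributes in the colour plane,
# matching the hues given above (in degrees).
DIRECTIONS = {"extreme": 0.0, "ballad": 120.0, "relax": 240.0}

def hue_of(components):
    """Plot each vector component along its attribute's direction and
    return the angle (hue) of the resulting sum, in [0, 360)."""
    x = sum(v * math.cos(math.radians(DIRECTIONS[a]))
            for a, v in components.items())
    y = sum(v * math.sin(math.radians(DIRECTIONS[a]))
            for a, v in components.items())
    return math.degrees(math.atan2(y, x)) % 360.0
```

For example, y 0 = 0.5 along the 0 degree direction and y 1 = 1.0 along the 120 degree direction give a hue of 90 degrees, consistent with the example above.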
  • the saturation and lightness of an object may be set equal for all objects.
  • saturation and lightness can be used to indicate further music attributes or parameters.
  • the saturation may depend on the sad/happy parameter and/or the tempo.
  • the luminance is made directly proportional to a relative speed score of the models, wherein the relative score is derived from the absolute speed value in relation to the maximum speed range available in the database.
  • FIG. 5 refers to a method of selecting data files.
  • a colour selection object may be displayed on the display of a consumer device.
  • the colour selection object has sections of different colour.
  • the colour selection object includes the fundamental colours and secondary colours.
  • the colour selection object is a complete hue colour wheel, a complete RGB field or a section of a hue colour wheel or an RGB field.
  • the user may select a colour or colour field 510 to define an input hue or an input hue range φ 1 to φ 2 .
  • the consumer electronic device then may select those music files whose first vectors result in hue values with the lowest degree of discrepancy to the input hue. For example, if the consumer electronic device is preset to generate playlists with a predetermined number of entries, for example ten, the consumer electronic device selects the ten music files with the closest hue values for the playlist. According to other embodiments, the consumer electronic device may generate a playlist of all music files whose hues fall within the input hue range φ 1 to φ 2 .
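Both selection modes above can be sketched as follows. The circular hue distance measure and the data layout of `file_hues` are assumptions for illustration.

```python
# Hue-based selection: file_hues maps music file names to the hue of
# their displayed objects; input_hue, lo, hi come from the colour
# selection object on the display.
def playlist_by_hue(file_hues, input_hue, n=10):
    """The n files whose hues lie closest to the input hue on the
    colour circle (n = 10 matches the preset in the example above)."""
    def circular_distance(name):
        d = abs(file_hues[name] - input_hue) % 360.0
        return min(d, 360.0 - d)
    return sorted(file_hues, key=circular_distance)[:n]

def playlist_by_range(file_hues, lo, hi):
    """All files whose hues fall within the input hue range."""
    return [name for name, hue in file_hues.items() if lo <= hue <= hi]
```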
  • a user may select, at an apparatus, for example at a consumer electronic device, a predetermined number of different music attributes, for example three music attributes, that are descriptive of a mood delivered by music represented by music content of a music file ( 602 ). From parameters of a music file available in the apparatus, the apparatus determines a first vector descriptive of a relationship between the target music file and the selected music attributes and a second vector descriptive of a relationship between the target music file and a proper or improper subset of the music attributes ( 604 ), and outputs classification information related to the music file containing at least the first and second vectors ( 606 ).

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Library & Information Science (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • User Interface Of Digital Computer (AREA)
  • Auxiliary Devices For Music (AREA)

Abstract

At an apparatus (200), which may be a consumer electronic device, the user may select a predetermined number of different music attributes (Attr) descriptive of a mood delivered by music represented by music content of a music file. From parameters of a music file available in the apparatus (200), the apparatus (200) determines a first vector (m1(Dat)) descriptive of a relationship between the music content and all selected music attributes (Attr) and a second vector (m2(Dat)) and outputs a classification information related to the music file and containing the first and second vectors (m1(Dat), m2(Dat)). The classification information may be used to assign a colour to a displayed object assigned to the music file. Perceptually similar music files may be displayed in similar colours.

Description

  • The object underlying the invention is to provide a method that assists the user in classifying music files, wherein the results of the classification are in closer accordance with the user's notion of similarity in perception. Another object is a graphical interface for displaying classified music files and allowing a music file selection mechanism whose results meet the user's expectations to a higher degree. A further object is to provide consumer electronic devices allowing enhanced music file classification, display and selection. These objects are achieved with the subject matters of the independent claims. Further embodiments are specified in the corresponding dependent claims.
  • Details of the invention will become more apparent from the following description of embodiments in connection with the accompanying drawings. The features of the various embodiments may be combined unless they exclude each other.
  • FIG. 1 is a simplified block diagram illustrating a consumer electronic device in accordance with an embodiment referring to a calculator unit for calculating a mood related vector.
  • FIG. 2 is a simplified block diagram showing a consumer electronic device in accordance with another embodiment referring to handheld consumer electronic devices for display and selection of music files.
  • FIG. 3A is a schematic illustration of a display displaying a list for defining a first mood related vector in accordance with an embodiment referring to a method for classifying music files.
  • FIG. 3B is a schematic illustration of display domains assigned to music attributes in accordance with embodiments referring to methods for displaying and selecting music files.
  • FIG. 3C is a schematic illustration of a display displaying objects assigned to music files in accordance with the embodiment of FIG. 3B.
  • FIG. 4 is a simplified diagram of a colour plane for illustrating details of colour allocation to mood related vectors in accordance with the embodiment of FIG. 3C.
  • FIG. 5 is a simplified diagram of a colour plane for illustrating details of a selection mechanism in accordance with another embodiment referring to a method of selecting music files on the basis of the embodiment of FIG. 3B.
  • FIG. 6 is a simplified flow chart for illustrating a method of classifying music files in accordance with a further embodiment.
  • FIG. 1 refers to an apparatus 200 which is an electronic device such as a consumer electronic device. The apparatus 200 may be a handheld device with a music playback function, for example a portable music player, a cellular phone, a digital personal assistant, or a stationary device, for example a home entertainment computer or an audio tuner with music record function. The consumer electronic device may have an input port 202 for receiving a music file Dat or a storage unit 230 for locally storing music files or both.
  • The consumer electronic device may include an extraction unit 210 for determining parameters descriptive of the music contained in a music file Dat provided by the storage unit 230 or via the input port 202. The parameters may concern perception-related features like tempo, rhythm, dynamic range, instrumentation, beats per minute, time domain aspects, frequency domain aspects, and cepstral aspects, and non-perception-related features such as interpreter, year of release and language. In accordance with other embodiments, the consumer electronic device may receive the parameters assigned to a music file from another device. For example, the consumer electronic device may receive the parameters together with the respective music file from another consumer electronic device or a server.
  • The consumer electronic device may also have a further input port 204 for receiving information on selected music attributes (“channels”, “mood models”) Attr descriptive of a perceptual notion delivered by music represented by the music content of a music file. Alternatively or additionally the consumer electronic device may include a user entry unit 220, with which a user may select at least two different music attributes.
  • A music attribute Attr may be represented by or may be derived from one single music parameter or music feature, for example tempo, dynamic range or mean loudness, or may combine two or more parameters in a way that provides perception-oriented information about the music content of the respective music file, for example information related to genres to which the music can be assigned, or information related to moods descriptive of the perceptual notion of the music. The music attributes Attr may be derived from the complete music file contents or from a relevant part thereof.
  • For example, some combinations of parameters or parameter ranges may be typical for a perception of a music content as more or less "extreme", other combinations and parameter ranges may be typical for a perception as "relaxed", "energetic", or "upbeat", or for a perception as belonging to a cross-genre class like "ballad", "electronic", "acoustic", "lounge", "classical" or "music". Other music attributes may correspond to genres like "pop", "rock", "classic" or "jazz". According to an embodiment, the user entry unit 220 allows the selection of two or three different music attributes.
  • On the basis of all or a subset of the parameters of a target music file Dat, a calculator unit 110 calculates a first vector m1(Dat) that is descriptive of a degree of agreement between the music content of the target music file Dat and the selected music attributes Attr. The calculator unit 110 outputs classification information containing at least the first vector m1(Dat). The classification information also contains additional information m2(Dat) descriptive of a degree of agreement between the music content of the target music file Dat and another proper or improper subset of the music attributes. For example, the second vector m2(Dat) represents the best fitting music attribute and defines the affinity to a domain in a plane in which the music attributes are grouped.
  • According to other embodiments, the additional information m2(Dat) is a second, different mood-related vector determined by the calculator unit 110 and descriptive of the music represented by the music content of the same music file Dat. The second vector m2(Dat) may define a position in a mood plane based on two different parameters or music attributes, for example one single parameter and one music attribute based on the evaluation of at least two parameters. The two-dimensional mood plane may define a Cartesian or rotational coordinate system, by way of example. According to an illustrative embodiment, the user may select a mood/tempo-space, wherein the position of a music file is defined by a first parameter describing a value on a sad-to-happy axis and a second, different parameter describing a value on a beat-per-minute axis.
  • The classification information may be displayed or may be transmitted to a further consumer electronic device. According to other embodiments, the classification information may be assigned to the music file Dat and stored as an attached attribute together with the respective music file Dat within the consumer electronic device, for example in the storage unit 230. The classification information may be used for automatically generating playlists, or for finding music files perceptually similar with an identified piece of music, for example the music file most recently selected or played-back at the consumer electronic device or for selecting perceptually similar music files.
  • FIG. 2 refers to an apparatus 200 where the classification information is used for assisting the user in selecting music files out from a plurality of music files available in a database. According to the illustrated embodiment, the apparatus 200 is a handheld consumer electronic device with a display 250 and a user entry unit 220.
  • In response to a user command input at the user entry unit 220, a control unit 150 of the consumer electronic device controls the transfer of music files stored in a storage unit 230 to an output unit 290. The control unit 150 may be or may include a processor. The user entry unit 220 may include a manual input device with buttons and/or sensors or may be integrated in or combined with the display 250, wherein the display 250 and the user entry unit 220 form a touchscreen capable of detecting and sensing the presence and location of a touch within a display area of the display 250. According to other embodiments, for example stationary consumer electronic devices, the user entry unit 220 may include a receiver unit for receiving electric signals from other electronic devices, for example an IR (infrared) receiver or an USB (universal serial bus) port. The output unit 290 may be or include a loudspeaker, a headphone jack, an audio output port or a data output port configured to transmit music files. The control unit 150 may transmit information to be displayed on the display 250 to a display control unit 280 that may be a graphical processor for controlling the display 250 to display the information provided by the control unit 150.
  • In accordance with an embodiment, the consumer electronic device may allow the user to select two or three channels for defining a first vector. When the user wishes to define his personal mood-space, he may operate the user entry unit 220 to request the control unit 150 to display a suitable selection menu on the display 250. For example, the control unit 150 may cause the display control unit 280 to display a list of music attributes on the display 250.
  • FIG. 3A shows a list 300 displayed on a display 250 and containing several entries 301, wherein each entry 301 represents one of a plurality of predefined music attributes. In accordance with an embodiment, the list 300 contains music attributes referring to genres and music attributes referring to perceptually different cross-genre music classes like "extreme", "energetic", "music", "upbeat", "podcast", "ballad", "electronic", "acoustic", "relax", "lounge", "classical", or others. Each music attribute may be a combination of parameter values of certain music features. The assignment of measurable parameter values to music attribute values may be based on evaluating the ratings of test music files by test persons and combining the evaluation results with the measurable parameters of the test music files.
  • Referring again to FIG. 2, the user may manipulate the user entry unit 220 to select a predetermined number of music attributes from the list 300. For example, the predetermined number of selectable music attributes is two or three. According to an embodiment, the user may select all two or three music attributes. In accordance with another embodiment, the user selects only one or two music attributes and the control unit 150 automatically chooses one or two further suitable music attributes. In accordance with further embodiments, the user selects one or two music attributes and the control unit 150 automatically chooses a suitable second or a suitable third music attribute such that for the user selected music attributes a neutral class can be identified, with reference to which at least two of the selected music attributes can be considered antithetic.
  • In addition, the user may assign a graphic attribute to each selected music attribute. The graphic attribute may be the colour or the contour of an object assigned to the music file. According to another example, the control unit 150 automatically assigns the graphic attribute, for example a fundamental colour, to each selected music attribute. For each music file contained in a database stored in the storage unit 230, the control unit 150 may output classification information containing a first vector that is derived from the values of the selected music attributes. According to an embodiment, the control unit 150 may also determine additional information, for example a best fitting music attribute or a second, different vector descriptive of the perceptual notion of the music file by the user. The control unit 150 may enclose the additional information, for example the best fitting music attribute or the second vector, in the output classification information.
  • Referring to FIG. 3B, the best fitting music attribute may be used to determine the position of an object in a mood plane, where the music attributes are assigned to different domains grouped in accordance with the perceptual proximity of the respective music attributes. For example, on a display 250 a first domain 351 may be assigned to the music attribute "classical", a second domain 352 to the attribute "lounge", a third domain 353 to the attribute "relax", a fourth domain 354 to the attribute "ballad", a fifth domain 355 to the attribute "acoustic", a sixth domain 356 to the attribute "extreme", a seventh domain 357 to the attribute "electronic", an eighth domain 358 to the attribute "energetic", and a ninth domain 359 to the attribute "upbeat". Objects assigned to music files having "relax" as the best fitting music attribute are displayed in or next to the third domain 353, objects assigned to music files with "classical" as best fitting music attribute are displayed in or near to the first domain 351.
  • In accordance with another embodiment the consumer electronic device may assist the user in defining the second vector. For example, the user may operate the user entry unit 220 to request the control unit 150 to display a suitable selection menu for music features and attributes. The selection menu may be a second list containing several entries, wherein each entry may represent a parameter like beats per minute or dynamic range, or a music attribute, for example a sad-happy index, or a perceptually different cross-genre music class like "extreme", "energetic", "music", "upbeat", "podcast", "ballad", "electronic", "acoustic", "relax", "lounge", "classical", or others. The selected entries for the second vector may define a mood plane on the display. The mood plane may be a Cartesian plane, where the axes denote parameters or metadata dimensions such as happy/sad or slow/fast, wherein a sad/happy parameter may be mapped onto the x-axis and a slow/fast parameter may be mapped onto the y-axis.
  • Referring again to FIG. 2, in a display and/or selection mode of the consumer electronic device, the classification information m1(Dat), m2(Dat) may be used to control the visual representation of objects assigned to the music files on the display 250. By way of example, the objects may be crosses, circles, points, squares or characters. For each displayed object the first vector m1(Dat) may determine an appearance of at least a section of the object and the additional information or the second vector may determine the position of the object on the display 250.
  • In accordance with an embodiment, the control unit 150 may map the second vector m2(Dat) into a planar, orthogonal Cartesian system with each of the two vector dimensions assigned to one of two orthogonal display axes. For example, the beats-per-minute axis of the mood-space may be mapped onto a y-axis of the display and the sad-to-happy axis may be mapped onto an x-axis, such that slow titles suggestive of sadness appear in the lower left quarter and fast titles suggestive of happiness appear in the upper right quarter.
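As an illustration, such an axis mapping might be sketched as follows; the function name, the value ranges and the bottom-left-origin convention are assumptions for the example, not taken from the description:

```python
def to_display_xy(sad_happy, bpm, width, height, bpm_range=(40.0, 200.0)):
    """Map a second vector (sad/happy in [-1, 1], tempo in BPM) to display
    coordinates: sad/slow -> lower left, happy/fast -> upper right."""
    # sad-to-happy axis mapped onto the x-axis
    x = (sad_happy + 1.0) / 2.0 * (width - 1)
    # beats-per-minute axis mapped onto the y-axis (y grows upwards here;
    # flip if the display origin is top-left)
    lo, hi = bpm_range
    bpm = min(max(bpm, lo), hi)  # clamp to the assumed tempo range
    y = (bpm - lo) / (hi - lo) * (height - 1)
    return x, y
```

A slow, sad title then lands at the lower left corner of the display area, while a fast, happy title lands at the upper right corner.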
  • According to an embodiment, the first vector m1(Dat) may determine the contour of the respective object. In accordance with another embodiment, the object contains colour information about at least the best fitting music attribute of the selected music attributes. For example, the object may have the colour of that music attribute that fits best with the first vector of the respective music files when a discrepancy between the first vector m1(Dat) and the music attribute does not exceed a predetermined threshold value. An object may contain colour information about two close music attributes when a discrepancy between the first vector m1(Dat) and the best fitting music attribute exceeds the predetermined threshold. In accordance with an embodiment, the object may include two or more sub-areas, each sub-area having a colour assigned to one of the music attributes, wherein an area ratio of the sub-areas may correspond to an agreement ratio of the first vector m1(Dat) with the respective music attribute.
  • In accordance with another embodiment, the colour of the object or at least a sub-area of the object is a combination colour that is mixed from the two fundamental colours assigned to the two closest music attributes when a discrepancy between the first vector and the closest music attribute exceeds the predetermined threshold.
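One way to realize such a combination colour is a linear blend of the two fundamental colours weighted by the agreement ratio. The description does not prescribe a particular mixing rule, so the following is only a sketch under that assumption:

```python
def mix_colours(c1, c2, w1):
    """Blend two fundamental RGB colours; w1 is the agreement ratio of the
    first attribute, in [0, 1] (assumed linear mixing rule)."""
    w1 = min(max(w1, 0.0), 1.0)
    return tuple(round(w1 * a + (1.0 - w1) * b) for a, b in zip(c1, c2))

# fundamental colours of the RGB system
RED, GREEN, BLUE = (255, 0, 0), (0, 255, 0), (0, 0, 255)
```

With equal agreement to both attributes, `mix_colours(RED, GREEN, 0.5)` yields a dark yellow, while `w1` near 1.0 reproduces the first fundamental colour.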
  • FIG. 3C refers to an embodiment where the positions of the objects result from the segmentation of a display 250 into domains 351-359 as described with reference to FIG. 3B and from the best fitting music attribute for the displayed music files. In the illustrative example, the selected music attributes are "extreme" assigned to the fundamental colour red, "ballad" assigned to the fundamental colour green, and "relax" assigned to the fundamental colour blue. In substance, the frequency of red objects is high near the sixth domain 356 assigned to the attribute "extreme", the frequency of green objects is high near the fourth domain 354 assigned to the attribute "ballad", and the frequency of blue objects is high near the third domain 353 assigned to the attribute "relax".
  • However, the method allows identification of music files that, though placed close to a first domain, show stronger perceptual proximity to other music files placed close to other domains. For example, the object 399 identifies a music file that, despite its proximity to the sixth domain 356 representing extreme music content, the user perceives as similar to the music files represented by objects 391, 392, which are positioned close to other domains but have a similar colour. Similar colours representing similar first vectors identify perceptually similar music content in a cross-genre manner.
  • Referring again to FIG. 2, the consumer electronic device may analyze each music file stored in the storage unit 230 to generate a score table for each music file. The control unit 150 may analyze a music file in response to a user command or automatically, for example when the music file is stored in the storage unit 230 for the first time. In accordance with other embodiments, the consumer electronic device may receive the score table together with the music file or the score table may already be embedded in the music file.
  • The score table is determined on the basis of objective, measurable parameters descriptive of the music content of the concerned music file. It assigns a score value to each music attribute selected for the first vector, wherein the score value is a measure for the degree of agreement of the music contained in the music file with a subjective perception quality represented by the music attribute. Table 1 shows an example for a score table assigned to a music file X:
  • TABLE 1
    Channel Score
    Extreme 5.65
    Energetic 6.48
    Music 7.03
    Upbeat 7.44
    Podcast 9.44
    Ballad 10.27
    Electronic 10.33
    Acoustic 10.96
    Relax 12.44
    Lounge 12.51
    Classical 12.79
  • For three selected music attributes relative scores may be evaluated with reference to a neutral class using equation (1):
  • xi = ( score{X, channel(i)} − score{X, channel("neutral class")} ) / score{X, channel("neutral class")}  (1)
  • According to the illustrated embodiment the music attribute "music" is selected as neutral class. With channel(0)=extreme, channel(1)=ballad, channel(2)=relax and the absolute scores score{X, channel(i)} of Table 1, equation (2) gives the relative score vector xi:

  • xi = [x0(extreme); x1(ballad); x2(relax)] = [0.196; −0.460; −0.768]  (2)
  • With reference to the neutral class “music”, the best fitting channel for the music file X is “extreme”. The channels “ballad” and “relax” follow in decreasing order.
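In code, the relative scores of equation (2) can be reproduced as follows. Note that in Table 1 a lower absolute score indicates a closer fit, so the sign convention below is chosen to match the worked values of equation (2); this reading is an interpretation of the example, not an explicit statement of the description:

```python
# Absolute scores for music file X (Table 1); lower score = closer fit
scores = {
    "extreme": 5.65, "energetic": 6.48, "music": 7.03,
    "upbeat": 7.44, "podcast": 9.44, "ballad": 10.27,
    "electronic": 10.33, "acoustic": 10.96, "relax": 12.44,
    "lounge": 12.51, "classical": 12.79,
}

def relative_scores(scores, channels, neutral="music"):
    """Relative score of each selected channel with reference to the
    neutral class, signed so that a better-than-neutral fit (lower
    absolute score) comes out positive, as in equation (2)."""
    n = scores[neutral]
    return [(n - scores[c]) / n for c in channels]

x = relative_scores(scores, ["extreme", "ballad", "relax"])
```

The resulting vector is approximately [0.196, −0.461, −0.770], matching equation (2) up to rounding, with "extreme" as the best fitting channel.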
  • According to an embodiment, the relative score values may be used for determining graphic properties of a displayed object assigned to the music file X. In accordance with another embodiment, the relative scores may be normalized with respect to the music files available in the database of interest, for example the whole or a selected part of that database, to which the music file X belongs. For example, the relative scores may be normalized with respect to the widest range of all reference relative channel scores available in the database using equation (3):
  • yi = ( xi − min(x) ) / ( max{i} − min{i} )  (3)
  • wherein yi is the normalized relative score. For the illustrative example, the database of interest may deliver the following values for max {i} and min {i}:
  • TABLE 2
    max(extreme) 0.412
    min(extreme) −7.013
    max(ballad) 0.228
    min(ballad) 0.611
    max(relax) 1.394
    min(relax) −0.852
  • With min(x) equal to −0.768, the resulting normalized vector becomes:

  • yi = [y0(extreme); y1(ballad); y2(relax)] = [0.129; −0.365; 0]  (4)
  • The first two values may then be used as vector norms in a two-dimensional colour plane. The preceding normalization step allows utilizing the complete range of values for these channels which also results in a larger variation of the norms.
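The normalization of equation (3) can be sketched as below. The per-channel database ranges are passed as (min, max) pairs; the ballad range from Table 2 is taken here as (−0.611, 0.228), on the assumption that the printed min value carries a sign slip, so only the extreme and relax components are checked against equation (4):

```python
def normalise(x, ranges):
    """Equation (3): shift each relative score by the smallest component
    of x, then scale by the channel's database-wide score range."""
    mn = min(x)
    return [(xi - mn) / (hi - lo) for xi, (lo, hi) in zip(x, ranges)]

# relative scores from equation (2) and ranges from Table 2
y = normalise([0.196, -0.460, -0.768],
              [(-7.013, 0.412), (-0.611, 0.228), (-0.852, 1.394)])
```

The extreme component reproduces the 0.129 of equation (4), and the relax component is exactly zero because it equals min(x).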
  • FIG. 4 illustrates details of colour assignment in accordance with an embodiment referring to the selected music attributes “extreme”, “relax”, and “ballad”. In a colour plane, three music attributes are assigned to three different colours and three different directions related to a point of origin 410. According to an embodiment, a first music attribute 401 is assigned to a first colour 421 and a first direction 441, a second music attribute 402 is assigned to a second colour 422 and a second direction 442 different from the first direction 441, and a third music attribute 403 is assigned to a third colour 423 and a third direction 443 different from the first and the second direction. The second direction 442 may be rotated by 120 degrees versus the first direction 441 and the third direction 443 may be rotated by −120 degrees versus the first direction 441. The first, second and third colours may be the fundamental colours of the RGB (red, green, blue) colour system. According to an embodiment, the three directions 441, 442, 443 span a hue system, wherein the direction of the first vector referred to the hue system defines the hue of an object assigned to the first vector.
  • According to the illustrative example, the music attribute "extreme" is assigned to a hue of 0 degrees (red), the music attribute "ballad" is assigned to a hue of 120 degrees (green) and the music attribute "relax" is assigned to a hue of 240 degrees (blue). The vector yi, with the component y0 plotted along the 0 degree direction and the component y1 plotted along the 120 degree direction, gives a hue Φ of about 90 degrees. As a result, in a system where the music attributes "extreme", "relax" and "ballad" are assigned to the hues for red, green and blue respectively, the music file X is assigned to a combination of green and yellow.
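The hue evaluation of FIG. 4 might be sketched as follows. Plotting the component magnitudes along the attribute directions is an assumption inferred from the worked example, which places the size of y1 along the 120 degree direction:

```python
import math

# hue directions of the three selected attributes (FIG. 4)
DIRECTIONS = {"extreme": 0.0, "ballad": 120.0, "relax": 240.0}

def hue_from_vector(components):
    """Resultant hue (degrees, 0..360) of a first vector in the colour
    plane: sum the component magnitudes as vectors along their attribute
    directions and take the angle of the resultant."""
    hx = sum(abs(v) * math.cos(math.radians(DIRECTIONS[a]))
             for a, v in components.items())
    hy = sum(abs(v) * math.sin(math.radians(DIRECTIONS[a]))
             for a, v in components.items())
    return math.degrees(math.atan2(hy, hx)) % 360.0
```

For the normalized vector of equation (4), `hue_from_vector({"extreme": 0.129, "ballad": -0.365, "relax": 0.0})` yields a hue near 100 degrees, consistent with the "about 90 degrees" of the example.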
  • Objects of music files that the user perceives as similar to the music file X are assigned to similar hues. Perceptual divergences are transformed into colour variations. When objects assigned to music files are displayed in colours corresponding to the evaluated hues, the user can easily determine the perceptual distance on the basis of the degree of colour deviation.
  • The saturation and lightness of an object may be set equal for all objects. In accordance with other embodiments, saturation and lightness can be used to indicate further music attributes or parameters. For example, the saturation may depend on the sad/happy parameter and/or the tempo. According to an embodiment, the luminance is made directly proportional to a relative speed score of the models, wherein the relative score is derived from the absolute speed value relative to the maximum speed range available in the database.
  • FIG. 5 refers to a method of selecting data files. A colour selection object may be displayed on the display of a consumer device. The colour selection object has sections of different colour. According to an embodiment, the colour selection object includes the fundamental colours and secondary colours. For example, the colour selection object may be a complete hue colour wheel, a complete RGB field, or a section of a hue colour wheel or of an RGB field. The user may select a colour or colour field 510 to define an input hue or an input hue range Φ1 to Φ2.
  • The consumer electronic device then may select those music files whose first vectors result in hue values with the lowest degree of discrepancy to the input hue. For example, if the consumer electronic device is preset to generate playlists with a predetermined number of entries, for example ten, the consumer electronic device selects the ten music files with the closest hue values for the playlist. According to other embodiments, the consumer electronic device may generate a playlist of all music files whose hues fall within the input hue range Φ1 to Φ2.
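Both selection modes (a fixed-length playlist of the closest hues, and an input hue range) can be sketched as below; the names and the circular-distance measure are illustrative assumptions:

```python
def closest_by_hue(library, target_hue, n=10):
    """Playlist of the n music files whose evaluated hues have the lowest
    circular distance (in degrees) to the selected input hue."""
    def dist(hue):
        d = abs(hue - target_hue) % 360.0
        return min(d, 360.0 - d)
    return sorted(library, key=lambda title: dist(library[title]))[:n]

def in_hue_range(library, phi1, phi2):
    """Playlist of all music files whose hues fall within the input hue
    range phi1..phi2 (assumed not to wrap past 360 degrees)."""
    return [t for t, h in library.items() if phi1 <= h <= phi2]
```

Here `library` maps a title to its evaluated hue; a hue of 350 degrees counts as close to an input hue of 0 degrees because the distance is taken around the colour wheel.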
  • According to the method illustrated in the flow chart of FIG. 6, a user may select, at an apparatus, for example at a consumer electronic device, a predetermined number of different music attributes, for example three music attributes, that are descriptive of a mood delivered by music represented by music content of a music file (602). From parameters of a music file available in the apparatus, the apparatus determines a first vector descriptive of a relationship between the target music file and the selected music attributes and a second vector descriptive of a relationship between the target music file and a proper or improper subset of the music attributes (604) and outputs classification information related to the music file, containing at least the first and second vectors (606).

Claims (15)

1. A method of operating an apparatus evaluating music files, the method comprising:
selecting, at the apparatus, a predetermined number of different music attributes descriptive of a mood of a piece of music included in a music file;
determining, from parameters of a target music file available in the apparatus, a first vector descriptive of a relationship between the target music file and the selected music attributes and a second vector descriptive of a relationship between the target music file and a subset of the music attributes; and
outputting classification information containing the first and second vectors.
2. The method of claim 1, further comprising
displaying an object on a display in dependence on the classification information.
3. The method of claim 2, wherein
the first vector determines a graphic attribute and the second vector determines a position of the displayed object.
4. The method of claim 1, wherein
similar first vectors identify perceptually similar music content in at least two different genres.
5. The method of claim 1, wherein
the predetermined number of different music attributes is three.
6. The method of claim 5, wherein
each selected music attribute is assigned to a different fundamental colour and the first vector determines the colour of the displayed object.
7. The method of claim 6, wherein
the object includes colour information related to two selected music attributes.
8. The method of claim 6, wherein
the object includes an area having a colour that is a combination colour mixed from the two fundamental colours assigned to the two closest selected music attributes.
9. The method of claim 1, further comprising
sensing a user selection of one or a subset of the displayed objects.
10. An electronic device including
a user entry unit configured to be operated by a user, wherein a user can select a predetermined number of different music attributes descriptive of a mood of a piece of music contained in a music file;
a calculator unit configured to determine, from parameters of a target music file available in the electronic device, a first vector descriptive of a relationship between the target music file and all selected music attributes and a second vector descriptive of a relationship between the target music file and a subset of the music attributes, the calculator unit further being configured to output classification information containing the first and second vectors.
11. The electronic device of claim 10, further comprising
a display for displaying objects; and
a display control unit configured to control the display to display objects assigned to a plurality of music files in dependence on their classification information, wherein for each displayed object,
the second vector determines the position of the object, and
the first vector determines a colour of at least a sub-area of the object.
12. The electronic device of claim 11, wherein
the object includes colour information related to at least two selected music attributes.
13. The electronic device of claim 11, wherein
the object includes at least a sub-area having a colour that is a combination colour mixed from the two colours assigned to the two closest music attributes.
14. The electronic device of claim 10, wherein
the user entry unit is further configured to sense a user selection of one or more of the displayed objects, a colour or a colour range.
15. The electronic device of claim 10, wherein
the predetermined number of different music attributes is three.
US13/079,362 2010-04-16 2011-04-04 Apparatus and method for classifying, displaying and selecting music files Active 2032-04-16 US8686270B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
EP10004073.2 2010-04-16
EP10004073 2010-04-16
EP10004073 2010-04-16

Publications (2)

Publication Number Publication Date
US20110252947A1 true US20110252947A1 (en) 2011-10-20
US8686270B2 US8686270B2 (en) 2014-04-01

Family

ID=44787133

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/079,362 Active 2032-04-16 US8686270B2 (en) 2010-04-16 2011-04-04 Apparatus and method for classifying, displaying and selecting music files

Country Status (1)

Country Link
US (1) US8686270B2 (en)

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090281906A1 (en) * 2008-05-07 2009-11-12 Microsoft Corporation Music Recommendation using Emotional Allocation Modeling
US20090277322A1 (en) * 2008-05-07 2009-11-12 Microsoft Corporation Scalable Music Recommendation by Search
US20120226706A1 (en) * 2011-03-03 2012-09-06 Samsung Electronics Co. Ltd. System, apparatus and method for sorting music files based on moods
US20140074861A1 (en) * 2012-09-07 2014-03-13 Eric Kenson Bieschke System and Method for Combining Inputs to Generate and Modify Playlists
US8686270B2 (en) * 2010-04-16 2014-04-01 Sony Corporation Apparatus and method for classifying, displaying and selecting music files
US20150000505A1 (en) * 2013-05-28 2015-01-01 Aalto-Korkeakoulusäätiö Techniques for analyzing parameters of a musical performance
US20150013525A1 (en) * 2013-07-09 2015-01-15 Miselu Inc. Music User Interface Sensor
US20150052436A1 (en) * 2012-01-06 2015-02-19 Gracenote, Inc. User interface to media files
WO2015110823A1 (en) * 2014-01-24 2015-07-30 British Broadcasting Corporation Processing audio data to produce metadata
US20150220633A1 (en) * 2013-03-14 2015-08-06 Aperture Investments, Llc Music selection and organization using rhythm, texture and pitch
US9335818B2 (en) 2013-03-15 2016-05-10 Pandora Media System and method of personalizing playlists using memory-based collaborative filtering
US9639871B2 (en) 2013-03-14 2017-05-02 Apperture Investments, Llc Methods and apparatuses for assigning moods to content and searching for moods to select content
US9875304B2 (en) 2013-03-14 2018-01-23 Aperture Investments, Llc Music selection and organization using audio fingerprints
US10061476B2 (en) 2013-03-14 2018-08-28 Aperture Investments, Llc Systems and methods for identifying, searching, organizing, selecting and distributing content based on mood
US10225328B2 (en) 2013-03-14 2019-03-05 Aperture Investments, Llc Music selection and organization using audio fingerprints
US20200013380A1 (en) * 2018-07-09 2020-01-09 Tree Goat Media, LLC Systems and methods for transforming digital audio content into visual topic-based segments
US10623480B2 (en) 2013-03-14 2020-04-14 Aperture Investments, Llc Music categorization using rhythm, texture and pitch
KR102274842B1 (en) * 2020-01-31 2021-07-08 견두헌 Apparatus for classifying music genres that express genres by color mixing and method thereof
US11271993B2 (en) 2013-03-14 2022-03-08 Aperture Investments, Llc Streaming music categorization using rhythm, texture and pitch
WO2022265132A1 (en) * 2021-06-17 2022-12-22 견두헌 Music genre classification apparatus and method for expressing genre by using color mixing method
US11609948B2 (en) 2014-03-27 2023-03-21 Aperture Investments, Llc Music streaming, playlist creation and streaming architecture
US12148408B2 (en) * 2023-09-05 2024-11-19 Tree Goat Media, Inc. Systems and methods for transforming digital audio content into visual topic-based segments

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102077228B1 (en) 2015-09-03 2020-04-07 Samsung Electronics Co., Ltd. Electronic device and Method for controlling the electronic device thereof

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007172523A (en) * 2005-12-26 2007-07-05 Sony Corp Information processor, information processing method, and program
US20070204227A1 (en) * 2006-02-24 2007-08-30 Kretz Hans M Graphical playlist
US20080294277A1 (en) * 1999-06-28 2008-11-27 Musicip Corporation System and Method for Shuffling a Playlist
US20100328312A1 (en) * 2006-10-20 2010-12-30 Justin Donaldson Personal music recommendation mapping
US20110087665A1 (en) * 2009-10-12 2011-04-14 Weare Christopher B Client playlist generation
US20120023403A1 (en) * 2010-07-21 2012-01-26 Tilman Herberger System and method for dynamic generation of individualized playlists according to user selection of musical features
US20120096026A1 (en) * 2009-07-14 2012-04-19 Katsu Saito Content recommendation system, content recommendation method, content recommendation device, and information storage medium
US20120101966A1 (en) * 2010-10-21 2012-04-26 Bart Van Coppenolle Method and apparatus for neuropsychological modeling of human experience and purchasing behavior
US8204837B2 (en) * 2006-01-06 2012-06-19 Sony Corporation Information processing apparatus and method, and program for providing information suitable for a predetermined mood of a user
US20120254806A1 (en) * 2011-03-30 2012-10-04 Google Inc. System and method for dynamic, feature-based playlist generation
US20120278831A1 (en) * 2011-04-27 2012-11-01 Van Coppenolle Bart P E Method and apparatus for collaborative upload of content
US20120290621A1 (en) * 2011-05-09 2012-11-15 Heitz Iii Geremy A Generating a playlist
US20130024547A1 (en) * 2011-07-21 2013-01-24 Katsu Saito Information processing apparatus, information processing system, information processing method, and program
US20130080565A1 (en) * 2011-09-28 2013-03-28 Bart P.E. van Coppenolle Method and apparatus for collaborative upload of content

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100643308B1 (en) 2005-07-11 2006-11-10 Samsung Electronics Co., Ltd. Apparatus and method for providing function to search for music file
KR100717387B1 (en) 2006-01-26 2007-05-11 Samsung Electronics Co., Ltd. Method and apparatus for searching similar music
US8686270B2 (en) * 2010-04-16 2014-04-01 Sony Corporation Apparatus and method for classifying, displaying and selecting music files

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080294277A1 (en) * 1999-06-28 2008-11-27 Musicip Corporation System and Method for Shuffling a Playlist
JP2007172523A (en) * 2005-12-26 2007-07-05 Sony Corp Information processor, information processing method, and program
US8204837B2 (en) * 2006-01-06 2012-06-19 Sony Corporation Information processing apparatus and method, and program for providing information suitable for a predetermined mood of a user
US20070204227A1 (en) * 2006-02-24 2007-08-30 Kretz Hans M Graphical playlist
US20100328312A1 (en) * 2006-10-20 2010-12-30 Justin Donaldson Personal music recommendation mapping
US20120096026A1 (en) * 2009-07-14 2012-04-19 Katsu Saito Content recommendation system, content recommendation method, content recommendation device, and information storage medium
US20110087665A1 (en) * 2009-10-12 2011-04-14 Weare Christopher B Client playlist generation
US20120023403A1 (en) * 2010-07-21 2012-01-26 Tilman Herberger System and method for dynamic generation of individualized playlists according to user selection of musical features
US20120101966A1 (en) * 2010-10-21 2012-04-26 Bart Van Coppenolle Method and apparatus for neuropsychological modeling of human experience and purchasing behavior
US20130086698A1 (en) * 2010-10-21 2013-04-04 Bart P.E. van Coppenolle Method and apparatus for distributed upload of content
US20120254806A1 (en) * 2011-03-30 2012-10-04 Google Inc. System and method for dynamic, feature-based playlist generation
US20120278831A1 (en) * 2011-04-27 2012-11-01 Van Coppenolle Bart P E Method and apparatus for collaborative upload of content
US20120290621A1 (en) * 2011-05-09 2012-11-15 Heitz Iii Geremy A Generating a playlist
US20130024547A1 (en) * 2011-07-21 2013-01-24 Katsu Saito Information processing apparatus, information processing system, information processing method, and program
US20130080565A1 (en) * 2011-09-28 2013-03-28 Bart P.E. van Coppenolle Method and apparatus for collaborative upload of content

Cited By (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090281906A1 (en) * 2008-05-07 2009-11-12 Microsoft Corporation Music Recommendation using Emotional Allocation Modeling
US20090277322A1 (en) * 2008-05-07 2009-11-12 Microsoft Corporation Scalable Music Recommendation by Search
US8344233B2 (en) 2008-05-07 2013-01-01 Microsoft Corporation Scalable music recommendation by search
US8438168B2 (en) 2008-05-07 2013-05-07 Microsoft Corporation Scalable music recommendation by search
US8650094B2 (en) * 2008-05-07 2014-02-11 Microsoft Corporation Music recommendation using emotional allocation modeling
US8686270B2 (en) * 2010-04-16 2014-04-01 Sony Corporation Apparatus and method for classifying, displaying and selecting music files
US20120226706A1 (en) * 2011-03-03 2012-09-06 Samsung Electronics Co. Ltd. System, apparatus and method for sorting music files based on moods
US9891796B2 (en) * 2012-01-06 2018-02-13 Gracenote, Inc. User interface to media files
US20150052436A1 (en) * 2012-01-06 2015-02-19 Gracenote, Inc. User interface to media files
US20140074861A1 (en) * 2012-09-07 2014-03-13 Eric Kenson Bieschke System and Method for Combining Inputs to Generate and Modify Playlists
JP2016500828A (en) * 2012-09-07 2016-01-14 Pandora Media System and method for synthesizing inputs to generate and modify playlists
AU2013312361B2 (en) * 2012-09-07 2017-03-30 Pandora Media System and method for combining inputs to generate and modify playlists
US9367587B2 (en) * 2012-09-07 2016-06-14 Pandora Media System and method for combining inputs to generate and modify playlists
US11271993B2 (en) 2013-03-14 2022-03-08 Aperture Investments, Llc Streaming music categorization using rhythm, texture and pitch
US20150220633A1 (en) * 2013-03-14 2015-08-06 Aperture Investments, Llc Music selection and organization using rhythm, texture and pitch
US10623480B2 (en) 2013-03-14 2020-04-14 Aperture Investments, Llc Music categorization using rhythm, texture and pitch
US9639871B2 (en) 2013-03-14 2017-05-02 Aperture Investments, Llc Methods and apparatuses for assigning moods to content and searching for moods to select content
US9875304B2 (en) 2013-03-14 2018-01-23 Aperture Investments, Llc Music selection and organization using audio fingerprints
US10061476B2 (en) 2013-03-14 2018-08-28 Aperture Investments, Llc Systems and methods for identifying, searching, organizing, selecting and distributing content based on mood
US10225328B2 (en) 2013-03-14 2019-03-05 Aperture Investments, Llc Music selection and organization using audio fingerprints
US10242097B2 (en) * 2013-03-14 2019-03-26 Aperture Investments, Llc Music selection and organization using rhythm, texture and pitch
US9335818B2 (en) 2013-03-15 2016-05-10 Pandora Media System and method of personalizing playlists using memory-based collaborative filtering
US11204958B2 (en) 2013-03-15 2021-12-21 Pandora Media, Llc System and method of personalizing playlists using memory-based collaborative filtering
US10540396B2 (en) 2013-03-15 2020-01-21 Pandora Media, Llc System and method of personalizing playlists using memory-based collaborative filtering
US9040799B2 (en) * 2013-05-28 2015-05-26 Aalto-Korkeakoulusäätiö Techniques for analyzing parameters of a musical performance
US20150000505A1 (en) * 2013-05-28 2015-01-01 Aalto-Korkeakoulusäätiö Techniques for analyzing parameters of a musical performance
US20150013525A1 (en) * 2013-07-09 2015-01-15 Miselu Inc. Music User Interface Sensor
WO2015110823A1 (en) * 2014-01-24 2015-07-30 British Broadcasting Corporation Processing audio data to produce metadata
US11609948B2 (en) 2014-03-27 2023-03-21 Aperture Investments, Llc Music streaming, playlist creation and streaming architecture
US11899713B2 (en) 2014-03-27 2024-02-13 Aperture Investments, Llc Music streaming, playlist creation and streaming architecture
US10971121B2 (en) * 2018-07-09 2021-04-06 Tree Goat Media, Inc. Systems and methods for transforming digital audio content into visual topic-based segments
US20210166666A1 (en) * 2018-07-09 2021-06-03 Tree Goat Media, LLC Systems and methods for transforming digital audio content into visual topic-based segments
US20200013380A1 (en) * 2018-07-09 2020-01-09 Tree Goat Media, LLC Systems and methods for transforming digital audio content into visual topic-based segments
US11749241B2 (en) * 2018-07-09 2023-09-05 Tree Goat Media, Inc. Systems and methods for transforming digital audio content into visual topic-based segments
US20240177695A1 (en) * 2018-07-09 2024-05-30 Tree Goat Media, Inc. Systems and methods for transforming digital audio content into visual topic-based segments
KR102274842B1 (en) * 2020-01-31 2021-07-08 견두헌 Apparatus for classifying music genres that express genres by color mixing and method thereof
WO2022265132A1 (en) * 2021-06-17 2022-12-22 견두헌 Music genre classification apparatus and method for expressing genre by using color mixing method
US12148408B2 (en) * 2023-09-05 2024-11-19 Tree Goat Media, Inc. Systems and methods for transforming digital audio content into visual topic-based segments

Also Published As

Publication number Publication date
US8686270B2 (en) 2014-04-01

Similar Documents

Publication Publication Date Title
US8686270B2 (en) Apparatus and method for classifying, displaying and selecting music files
EP2159719B1 (en) Method for graphically displaying pieces of music
JP4938781B2 (en) Information presenting apparatus, information presenting method, information presenting program, and integrated circuit
US9372860B2 (en) Method, system and device for content recommendation
US9671859B2 (en) Information processing device, client device, server device, list generation method, list acquisition method, list providing method and program
US8838617B2 (en) Method and apparatus for searching for recommended music using emotional information of music
US8255428B2 (en) Graphical representation of assets stored on a portable media device
US7521620B2 (en) Method of and system for browsing of music
US20060224260A1 (en) Scan shuffle for building playlists
EP2410444A2 (en) System and method for dynamic generation of individualized playlists according to user selection of musical features
CN101399037B (en) Method and device for providing an overview of pieces of music
US20120117071A1 (en) Information processing device and method, information processing system, and program
US20100063952A1 (en) Music Information Processing Apparatus, Music Delivering System, And Music Information Processing Method That Can Satisfy A Request From A User
JP2006039704A (en) Play list generation device
EP1965322A1 (en) Information processing apparatus, information processing method, and information processing program
JP2007249740A (en) Content selection device and program
JPWO2008041764A1 (en) Music artist search apparatus and method
JP4916945B2 (en) Music information grant server, terminal, and music information grant system
CN106407286A (en) Music search method and apparatus
US9189887B2 (en) Information processing apparatus and information processing method
US20140068474A1 (en) Content selection apparatus, content selection method, and computer readable storage medium
JP6045069B2 (en) Operation terminal, karaoke system, and karaoke program
JP6736914B2 (en) Music selection device, karaoke system, and music selection program
JP2004070510A (en) Device, method and program for selecting and providing information, and recording medium for program for selecting and providing information
US20100275158A1 (en) System and a method for providing events to a user

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:EGGINK, JANA;GIRON, FRANCK;KEMP, THOMAS;SIGNING DATES FROM 20110401 TO 20110416;REEL/FRAME:026439/0974

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551)

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8