
WO2014152422A1 - Interactive marketing system (Système de mercatique interactive) - Google Patents

Interactive marketing system

Info

Publication number
WO2014152422A1
Authority
WO
WIPO (PCT)
Prior art keywords
video
product
user
interactive
content management
Prior art date
Application number
PCT/US2014/027325
Other languages
English (en)
Inventor
Andrei Paul AVERBUCH
Original Assignee
Gravidi, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 13/837,915 (US9277157B2)
Application filed by Gravidi, Inc. filed Critical Gravidi, Inc.
Publication of WO2014152422A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/02 Marketing; Price estimation or determination; Fundraising

Definitions

  • This invention relates generally to a content integration and delivery system and a method of using the same.
  • the present invention relates to a system that integrates digital video content with object-oriented script to provide object specific marketing, including potential channels of distribution, such as internet shopping.
  • the system includes the process of identifying and marking the relative location of specific objects within digital media and providing an outlay for the users to locate the object over the global system of interconnected computer networks (e.g. the Internet).
  • the marketing/advertisement approach and lack of access to additional related content is still the same. That is, advertisements are, for example, evenly distributed throughout the digital media; e.g., in the case of a full-length movie, the service provider sets time points at which the movie stops and an advertisement runs, and there is no ready access to additional content. Typically, all of this additional content is integrated and digitized together.
  • the drawback of the current system of advertisement, marketing, and additional content delivery is that the viewer may get agitated by the interruptions and has little or no connection to the product being promoted or the ability to access the information at any time during the playback.
  • an interactive marketing system allows the viewer to pick objects within digital video content for immediate or subsequent consideration.
  • the system integrates digital video content with object-oriented localization script to provide object specific marketing.
  • the system includes the process of identifying and marking the objects displayed within the digital media and providing an outlet for the viewers to locate the object over the internet or a preloaded database.
  • a digital content delivery or on-line shopping system/method involving TV, DVD player, and Internet is provided. While this embodiment is described with reference to a DVD player, those skilled in the art would recognize that any internet-connected device capable of delivering and presenting digital video media can be employed with the disclosed digital content delivery and on-line shopping system/method.
  • using a remote control or other interface, the user can sign up with a service provider and receive commercial product information, and other desired or relevant information, which can be stored as a multilayer of overlapping content on the DVD or locally in the memory of the device.
  • overlapping content is called a metadata layer and can include information about a target object, region or theme other than its mapping data. For example, it may include textual, image, video, audio and other descriptions of the target, as well as links to further related data outside the physical realm of the video such as the internet.
  • the user can retrieve the metadata layer at any time during the DVD, or other format video media, playback.
  • the user either can view the downloaded digital content (movie, video or audio) or review what particular products are available for purchase, or what objects within the media are tagged (or marked-up) using the metadata.
  • the user will access the data as he is accessing any regular DVD or other format video media. If the user is ready to do on-line shopping or view the objects within the media that are marked-up or tagged, the user has an option to pause the DVD or other format video media playback. This action also freezes the overlaying featured product or objects featured in the media metadata layer. The user can tab through visible highlighted products or objects featured in the digital media.
  • the user can place an order using a pre-stored credit account through the internet merchant gateway and have the product shipped to a pre-stored shipping address, or just find out more information about objects featured in the digital media.
  • the user of an exemplary embodiment of the invention can readily interact with the system by using a remote control, or analogous behavior in the device's user interface, to select and purchase a desired product, or just find out more information about objects featured in the media.
  • FIG. 1 is an overview of an exemplary main system with major components and their relationships
  • FIG. 2 is a flowchart of an exemplary ordering process
  • FIG. 3 is an example of using XML language to implement the Metadata structure for an available product
  • FIG. 4 shows the flowchart of an exemplary data retrieve process after the user selects a product from the displayed list.
  • FIG. 5 shows the flowchart of an exemplary items generation process in the content management system (CMS component or CMS Web Tool).
  • FIG. 6 shows the flowchart of exemplary steps taken in the composer tool, i.e., the process of linking items to objects in the video stream via hotspots and its relationship to the CMS Web Tool.
  • FIGs. 7A-7B shows the flowchart of exemplary steps taken in the player tool, i.e., the process of selecting information from the interactive overlay.
  • The generalized system configuration for integrating digital content with object-oriented script(s) to provide object specific marketing is illustrated in FIG. 1A.
  • the system 1 includes a display device 8, a digital media reproduction device 10 with user interface 16, and an Internet access means 6, such as phone line, cable line, electrical line, optical line or a wireless connection.
  • the digital media reproduction device 10 can be selected from any visual processing device capable of playing video (i.e., converting data into visual information), such as, but not limited to, a DVD/Blu-ray player, an internet-enabled TV, a desktop or laptop computer, a tablet-type computing device, and a mobile device (e.g., a smartphone).
  • Such a media reproduction device 10 is capable of playing streamed or locally encoded content identified as one or more of MPEG, MPEG-2, MPEG-4, Flash, AVC, AVCHD, DivX, DV-AVI, H.264, Matroska MKV, QuickTime, TiVo Video, and others (see www.fileinfo.com/filetypes/video for additional formats).
  • the media reproduction device 10 can be connected to the display device 8 via cables, wires, or a wireless connection.
  • the display device 8 can be a screen device that is connected to or embedded in the media reproduction device 10 and is capable of displaying video media, such as a TV, a computer, a tablet mobile device, etc.
  • the media reproduction device 10 and the display device 8 are interconnected and confined within the same device, e.g. a tablet.
  • the user interface 16 can transmit the user's input by wireless/wired means to the media reproduction device 10.
  • the user interface 16 can be any interface device such as a remote control (infra-red, radio frequency or Bluetooth), a mouse, a touchpad (e.g. touch screen tablets) or a combination thereof.
  • the user interface 16 can be a remote control transmitting commands to the media reproduction device 10 through an infrared signal.
  • the user interface 16 is used to interact with the object-oriented script profiles (or video embedded entities) within the encoded video content in the media reproduction device 10, using an interface script that calls up object-related information from the service provider 2 over the internet 4.
  • the video embedded entities are objects, and other concepts such as products, locations, semantic and abstract contexts related to objects or other distinct content within the encoded video, defined by semantic category as well as physical location in 2D or 3D space of an entity within the encoded video frame(s). For example, a car in a scene from the movie Bullitt; the sunglasses worn by Tom Cruise's character in the film Risky Business; the location of Times Square in NYC in the dream sequence of the film Magnolia; and many more.
  • the interface script is a software program (referred to hereafter as authoring software) written to allow the media reproduction device 10 to display and interact with objects and concrete concepts embedded/displayed within the encoded video through the user interface 16 by rendering an entity display layer and the encoded video simultaneously.
  • This interaction may consist of, but is not limited to, tagging of the video embedded entities, purchasing the video embedded entities showcased in the encoded video directly from the media reproduction device 10, as well as downloading and displaying additional contextual information related to the video embedded entities.
  • tagging of the video embedded entities is done in authoring software.
  • the authoring software allows the user to view the video as well as to play, pause and scrub frames. Scrubbing is moving forward or back through a video timeline at the user's own pace using a frame scroll interface. It is analogous to jog-shuttling.
  • the authoring software allows the user to view a frame-by-frame scroll view representation of the video for easy back and forth scrubbing, and gives the user an interface input capability by using a touch or a mouse or another human to computer interface to create a 2D or 3D mapping or a visual representation of a desired area or object in a video frame.
  • Scroll view is a linear display of sequential frames in the video which can be scrolled left or right to reveal more frames from an earlier or later time range of the video.
  • the user has the ability to "tag" video content through the selection of objects viewed in the authoring software.
  • This selection may take place on a touch enabled device in which the user can tap or touch the object to select it.
  • the selection may take place using a computer and a mouse to select the object.
  • selection of objects may take place through physical gestures, voice or other means.
  • the authoring software also allows the user to continue mapping the object or area (motion tracking) by using software image recognition or by manually navigating frame by frame and adding new coordinates to a sequential list of 2D or 3D coordinates until the object or area is no longer in view. After an object is tagged the object can be tracked through the video frames in which the object remains visible. In one embodiment the user can manually continue mapping the object by navigating frame by frame and adding coordinates to a sequential list. In another embodiment software can algorithmically identify and track objects and provide coordinates to the authoring software. The authoring software also allows the user to relate the tag coordinates list to a unique ID representing a set of metadata describing the object or area tagged by the coordinates.
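The per-frame coordinate list and unique-ID linkage described above can be sketched as follows. This is an illustrative model only; the class and method names (`Tag`, `add_frame`, and so on) are assumptions for the sketch, not the patent's implementation.

```python
class Tag:
    """Maps a tagged object to a sequential list of 2D coordinates, one per frame,
    related to a unique ID that keys into the item metadata."""

    def __init__(self, unique_id):
        self.unique_id = unique_id   # unique ID describing the tagged object
        self.coords = []             # sequential list of (frame, x, y) tuples

    def add_frame(self, frame, x, y):
        # Continue mapping the object by appending coordinates frame by frame.
        self.coords.append((frame, x, y))

    def edit_frame(self, frame, x, y):
        # Modify the coordinate set previously recorded for a given frame.
        self.coords = [(f, cx, cy) if f != frame else (frame, x, y)
                       for (f, cx, cy) in self.coords]

    def delete_frame(self, frame):
        # Delete the coordinate set for a frame completely.
        self.coords = [c for c in self.coords if c[0] != frame]


# Example: tag an object over two frames, then edit and delete entries.
tag = Tag("item-42")
tag.add_frame(100, 0.31, 0.55)   # frame where the object first appears
tag.add_frame(101, 0.33, 0.54)
tag.edit_frame(101, 0.34, 0.54)
tag.delete_frame(100)
```

The editing operations mirror the patent's description: any existing coordinate set can later be modified or removed for an individual frame.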
  • the user can also edit any existing coordinate set in any frame after it has already been created, either by modifying the coordinates or the unique ID or by deleting the coordinate set for that frame completely.
  • the user is provided with the capability to store a static image or copy of any video frame in relation to the object or area being tagged using the authoring software.
  • the user can as well store the frame range of a tagging sequence related to an object or area on the screen.
  • the user can use the frame range of a tagging sequence to generate independent clips of video from the original frames in that range.
  • the user can generate and store a file representing the tagging coordinate and visual mapping data (i.e., a visual representation of an area inside a video frame, which is a symbolic depiction highlighting relationships between elements within that area, such as objects, regions, and themes), which can then be used to generate visual, audio and/or any other representation of the tag data.
  • the authoring/playback software consists of, but is not limited to, three components: (1) content management system (CMS); (2) composer tool, and (3) playback tool.
  • each component can be part of one interactive system having the capabilities of all three components or each component can be independently implemented.
  • the authoring software can be implemented as part of an apparatus that has at least one processor, and at least one memory including the authoring software therein.
  • the CMS component can be used for creation of one or more items to be reproduced in a designated video stream (see e.g., Fig. 5).
  • the term "items" describes specific and unique objects or areas generated by the CMS component of the authoring software that can be linked to the objects in the video stream using hotspots generated by the composer tool (tag authoring process). The items, therefore, can be used to represent objects, regions and themes in the video frames.
  • the user can create a project, preferably over the web, that includes one or more video files and one or more items with metadata.
  • Each item can include, but is not limited to, (a) metadata information, (b) media information, (c) catalog information, (d) images, (e) secondary videos, and (f) other relevant information, preferably stored on an SQL-type database for future use and manipulation by the user.
  • the user can view a list of items and select which item is related to which object or area using a unique ID key, by incorporating a metadata item list file generated by the CMS component.
  • the user can, among other things, create additional users and authoring system administrators categorized by specific permissions and taxonomy related to the user's needs, and/or create permissions and taxonomy categories related to the user's needs.
  • Taxonomy is defined herein as a particular classification arranged in a hierarchical structure.
  • the CMS authoring also allows the user to create a project, which contains other categories and taxonomies allowing the user to create representations and data models of objects or areas generated by the composer tool component (or tag authoring process).
  • the CMS component of the authoring software allows the user to create structures in XML, JSON or other languages, which can be edited to allow for changes in the model data structure of the items, add metadata such as links to images, video and text as well as other abstract information necessary to describe the item, and add, store and edit images, video, and text in an editable format which is specific to the software that will interpret and display it.
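As a sketch of the kind of structure described, a hypothetical XML item record might look like the following; the element names and values are illustrative assumptions for this sketch, not the patent's actual schema:

```xml
<!-- Hypothetical CMS item structure; names and fields are assumptions. -->
<item id="item-42">
  <metadata>
    <name>Aviator Sunglasses</name>
    <type>product</type>
    <description>Sunglasses featured in the scene.</description>
    <url>https://example.com/item-42</url>
    <geotag lat="40.758" lon="-73.985"/>
  </metadata>
  <images>
    <image src="sunglasses.jpg"/>
  </images>
  <secondaryVideos>
    <video src="making-of.mp4"/>
  </secondaryVideos>
</item>
```

An equivalent JSON structure could carry the same fields; either form can be edited later to change the model data structure of the item, as the text describes.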
  • the CMS component of the authoring software allows the user to generate and store files representing the item metadata using unique ID keys.
  • the CMS component also allows the user to view a list of tagged objects and areas generated by the tag authoring process. Specifically, The CMS component allows the user (1) to select which item is to be related to a particular object or area using a unique ID key, (2) view an editable list of videos which are already authored and tagged or need to be tagged, (3) view the associated tagged object list generated by the tag authoring process (composer tool), and (4) view a visual or audio or other representation of the tagged objects and areas, if any, while viewing the videos in the list.
  • the CMS authoring component also allows the user to do a search of items based on keywords, text or other metadata, and/or frame location - temporal or spatial in the case of video.
  • the user can automatically generate items from outside of the CMS by using metadata to call outside services to pull relevant images, video, advertising, search results etc. from outside of the CMS, do a search of CMS authored items based on keywords, text or other metadata, and/or frame location - temporal or spatial in the case of video; do a search of tagged objects or areas in the video by time or frame location in the tag authored file generated by the composer tool (tag authoring). All objects and metadata are accessible at any point during the viewing process. Searching can pull up metadata at any point in the viewing process. Searching can be done via keywords or tags that are part of the metadata. Searching can be done via different types of human interaction, such as by using a remote control, a keyboard, a gesture recognition system, a touch screen, audio and voice recognition system and more.
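The keyword search over item metadata described above can be illustrated with a small sketch; the field names (`name`, `keywords`) are assumptions for the example:

```python
# Illustrative keyword search over CMS item metadata.
items = [
    {"id": "item-42", "name": "Aviator Sunglasses",
     "keywords": ["sunglasses", "fashion"]},
    {"id": "item-7", "name": "Sports Car",
     "keywords": ["car", "vehicle"]},
]

def search_items(query):
    """Return items whose name contains the query or whose keyword
    list includes it (case-insensitive on names)."""
    q = query.lower()
    return [i for i in items
            if q in i["name"].lower() or q in i["keywords"]]

results = search_items("car")
```

A fuller system would extend the same idea to temporal or spatial frame-location search and to remote-control, gesture, or voice input, as the text describes.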
  • the composer tool can be used, either as a standalone application or as part of the authoring software, to attach items generated by the CMS component to the objects (hotspots) inside a particular video stream/file.
  • a project can be selected with one or more available items.
  • the video associated with the project contains a plurality of key frames that can be displayed and reviewed at a frame rate of 0.1, 0.5, 1, 2, 5, 10, 24, 60 frames/second or any frame rate therebetween.
  • the key frames can be displayed based on the "scene" within the video.
  • a hotspot (XY or XYZ coordinates) is selected and associated with an item in the project. Additional parameters can also be added that correlate the item to the hotspot and the object of interest.
  • the coordinate system of the hotspots is based on spatial and temporal parameters. For example, in a 2D video, XY coordinates define the location of the object in the frame and t (time) coordinate defines the time stamp of when the object first appears in the video. Thereafter, the video is advanced to the next frame in time.
  • the hotspot will be displaced. In order to track the object, the XY coordinates of the hotspot are adjusted at the new t timestamp.
  • the hotspot tracks the object as a function of steps.
  • the location of the hotspot in the remaining frames can be automatically interpolated by the composer tool based on curve fitting (linear, polynomial, spline, Gaussian) or regression analysis.
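The simplest of the interpolation options mentioned, linear interpolation between two user-placed keyframes, can be sketched as follows; the function name and normalized coordinate convention are assumptions for the sketch:

```python
def interpolate_hotspot(f0, xy0, f1, xy1, frame):
    """Linearly interpolate a hotspot's XY position at `frame`,
    given keyframe f0 with position xy0 and keyframe f1 with xy1
    (f0 < frame < f1). Coordinates are normalized to the frame size."""
    t = (frame - f0) / (f1 - f0)   # fractional progress between keyframes
    return (xy0[0] + t * (xy1[0] - xy0[0]),
            xy0[1] + t * (xy1[1] - xy0[1]))

# Keyframes placed at frames 10 and 20; the composer tool fills in frame 15.
x, y = interpolate_hotspot(10, (0.2, 0.4), 20, (0.6, 0.8), 15)
# midway between the keyframes, approximately (0.4, 0.6)
```

Polynomial, spline, or Gaussian fitting would replace the linear blend with a higher-order curve fitted to more than two keyframes, but the frame-to-position mapping is the same idea.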
  • the player tool is preferably a standalone application that allows users to view the videos and interact with the overlaid information prepared with the CMS and the composer tool during the playback.
  • the user is provided with a library of the overlaid files that can be played in the player tool, at which point the user can select a video to play.
  • a short introduction about the video can be displayed before the playback is initiated.
  • the playback can start automatically after a predefined pause (e.g., 2-10 seconds) or the user can initiate the playback manually by pressing "PLAY".
  • the graphical video overlay which visually shows the viewer/user the interactive areas of the video, can be revealed automatically or by an action of the user (e.g., touching the touch-sensitive screen). At any time during the playback, the user can select a displayed icon with an object of interest.
  • the service provider 2 can have a merchant gateway, third party content/product feeder and the web-subscription services.
  • a service provider 2 delivers the streaming media and metadata about the objects represented in the streaming media through the Internet 4.
  • the service provider 2 may also provide user services such as e-commerce gateways, social sharing, location and other services, related to the video embedded entities tagged within the encoded video content.
  • the interface script can be stored in the firmware of the media reproduction device 10 (e.g. a random-access memory or RAM 14).
  • the interface script can be stored in a video media 12 (e.g. a DVD disk) or downloaded from the service provider 2 through Internet 4 and subsequently loaded into the RAM 14 of the media reproduction device 10 before the program is executed.
  • a user can initiate the program by clicking a predetermined key or a pre-determined sequence on the user interface 16 (Step 20).
  • a menu of options appears on the display device 8 (Step 22). If this is the first time that the user has invoked the program (Step 24), the user can select an option from the menu to set his/her personal profile, which may include a user ID, password, credit card number and/or other personal information (Step 26).
  • the supplied information can be stored locally, on the service provider's server, or remotely with a third party service (e.g. Google Wallet), and is preferably saved in encrypted form.
  • the service provider 2 can start to download data into the User's media reproduction device 10 (Steps 30 and 36).
  • the downloaded information can be product catalogs, commercial advertisements, marketing promotions or the entertainment package(s) the user purchased, such as movie, audio data, music, or even a software program.
  • the downloaded data can be temporarily stored in the RAM 14 of the User's media reproduction device 10 and subsequently stored in a secondary media (e.g. hard disk, DVD media, flash memory etc.) in Step 38.
  • the downloaded data can be permanently stored in the RAM 14 of the User's media reproduction device 10.
  • the authoring/playback software can use the predefined format of metadata and the interface (product display layer) to define and link specific video content to object-specific marketing.
  • the product display layer is a specific interface format layered on top of and time-synched with the video content, and is also spatially and temporally synched with hotspots in the video content.
  • Synched video content may include additional video content synched to the original video timeline.
  • Synched video content may include alternative versions of the original video content.
  • Synched video content may consist of 3D video-game engine content synched to the original video timeline and 2D/3D coordinates of objects in original video.
  • the overlay layer can display video embedded entity markers as well as other contextual details and information.
  • the markers can be defined as a collection of data points that describes and facilitates the display of graphic markers on a display device in the context of a 2D or 3D location within the dimensions of the video content visualized on the screen.
  • the marker can appear and disappear on the visualized video content.
  • the time stamp defines when the marker appears and the duration of the marker's appearance on the visualized video content relative to the video time frame and encoding.
  • a third party can use the disclosed format structure to define its products and go through the service provider to offer them to the end user.
  • the metadata file contains a list of different markers that define an item.
  • one of the markers may be a timecode, which is in synch with the timecode of the media reproduction device.
  • Other markers may include information about specific products featured in the media content, a product-id, and the information about a hotspot area of the video embedded entities.
  • the metadata file may also include two- dimensional (2D) or three-dimensional (3D) coordinates to indicate where the hotspot associated with the item will be displayed on the display device.
  • the metadata file would also contain a reference to the latest version of its related Featured Product Information Data file ("FPID").
  • FPID contains product specific data tied to the Metadata file's hotspot by product id as a key. This FPID data can include Brand Name, Price, Options and other information.
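A minimal sketch of the product-id keyed lookup this describes, with the FPID held as an in-memory mapping; the record fields are illustrative assumptions:

```python
# Hypothetical FPID data: product-specific records keyed by product id,
# the same key that ties a record to a hotspot in the metadata file.
fpid = {
    "item-42": {
        "brand": "ExampleBrand",
        "price": 129.99,
        "options": ["black", "gold"],
    },
}

def product_info(product_id):
    """Resolve a hotspot's product id to its FPID record, or None
    if the FPID has no entry for that id."""
    return fpid.get(product_id)
```

When the user highlights a hotspot, the player would call such a lookup with the hotspot's product id and render the returned brand, price, and options.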
  • Both the metadata and FPID files can be accessed either through an online download or from locally stored video media (e.g. DVD disk, digital file, etc.) that may contain the metadata and/or FPID file on extra tracks. The presence of such data on the media, as defined herein, makes such media content system compatible. To be a compatible format, the media must contain the FPID file in synch with the content stored on separate tracks of the media.
  • the authoring/playback software creates the product display layer using data selected from text, graphics, display style information, such as font, colors, size, position etc., from the metadata file and its associated FPID file.
  • the product display layer contains the hotspots created using the data from the two files. This allows a user to point to or highlight the product(s) to obtain additional information, to make a purchase selection and/or to perform other actions with the product.
  • All the downloadable data, the user personal profile, the authoring/playback software, merchant account information, user profile files, FPID, and user activity history file can be stored in digital form on a secondary storage either locally or remotely in cloud storage, preferably in an encrypted and secure format.
  • the authoring/playback software can provide an index to organize these data during storing or retrieving.
  • the FPID includes product information, such as brand name, price, availability, seller (if many), manufacturing information, reviews and options. This information can be held in a variety of formats, such as a delimited text file, an XML file, or an image file. All information can be stored locally, online, or on extra tracks of the media specifically supporting the authoring/playback system.
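For the delimited-text case, a parser sketch is shown below; the pipe delimiter and column order are assumptions, since the patent does not fix a concrete layout:

```python
import csv
import io

# Hypothetical delimited FPID text: header row, then one product per line.
FPID_TEXT = """product_id|brand|price
item-42|ExampleBrand|129.99
item-7|OtherBrand|49.50
"""

def parse_fpid(text):
    """Parse pipe-delimited FPID text into a dict keyed by product id."""
    reader = csv.DictReader(io.StringIO(text), delimiter="|")
    return {row["product_id"]: {"brand": row["brand"],
                                "price": float(row["price"])}
            for row in reader}

records = parse_fpid(FPID_TEXT)
```

An XML-backed FPID would be parsed analogously, with the product id attribute serving as the same key into the hotspot data.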
  • if the downloaded data is in a compatible format, it will include an overlapping layer of metadata.
  • the downloaded data may also include other programs, such as interface routines that describe and manipulate information about specific content existing in the media.
  • Upon receiving the downloaded data, the system retrieves a timecode from the media reproduction device, which has a built-in real-time clock.
  • the metadata layer contains timecode markers, which are in synch with the video content timecode.
  • the metadata layer can also contain related display data (text, graphics, style information [font, colors, etc.]), including two-dimensional or three-dimensional coordinates for the product(s) to indicate where it will be displayed on the display device.
  • the authoring/playback software also allows a user, using a remote control, to point to or highlight the displayed product(s) to make a selection.
  • the downloadable data can be in preexisting media content formats or in new authoring/playback software compatible format.
  • the overlay data can be written into the media content with necessary index and information, which can be retrieved later when the video data is retrieved from the media.
  • if the user has already registered his account with the service provider, the user turns on both the display device and the media reproduction device, and connects the media reproduction device to the Internet.
  • the user can operate the media reproduction device in a traditional manner or instead can click a designated key of the media reproduction device interface. For example, the user may click on the "info" key, to pause the media reproduction device playback operation, and then another designated key to invoke the authoring/playback software (also referred to as the DEIISS program/system) (Step 20).
  • the program checks if any metadata exists that corresponds to the media being played (Step 30). In that regard, the program automatically connects to the service provider through the Internet to download an updated metadata file (Step 30).
  • the system hands control to the regular media reproduction device interface software to run the regular video media functions (Step 32).
  • Alternatively, if a new or an updated metadata file exists from the service provider, it is downloaded into the media reproduction device (Step 36). Meanwhile, the program can also check if there is a new FPID. The user can decide if he/she wants to view the FPID or not. Using the user interface of the media reproduction device, the user can carry out all functions of the DEIISS system. For example, by clicking a designated key of the media reproduction device interface, a menu of selections is displayed on the display device (Step 22), offering several choices, such as: manage user profile; manage credit card information; view purchase history; view playback history; download software update; and view 3rd party products.
  • the user can click a key or sequence of keys to select one of these options and follow the self-explanatory description displayed on the display device to proceed. Therefore, a user using the media reproduction device user interface can retrieve the metadata layer at any time during the video playback, and view all available items (video embedded entities) with their description in detail. The user can also pause the video playback, then tab through the visible tags that describe a particular item and highlight the desired item.
  • the system retrieves the relevant FPID file and the metadata file from the storage and compares the media reproduction device's timecode with the timecode of the metadata.
  • the system can also retrieve the hotspot (window display area/video embedded entity marker) information, which contains the display information regarding the item.
  • By selectively pushing keys on the media reproduction device user interface, the user goes through the displayed menu and selects the items he/she wants to display, manipulate, purchase, or otherwise act on. At the end, a summary of the user's selections is displayed. The user can either confirm the selections or go back and change them before submitting, for example, the purchase order, if that is the functionality the user is performing on the items.
  • the order includes the product(s) (items or video embedded entities) he/she wants to purchase and personal account information, including shipping and billing addresses.
  • although an exemplary embodiment discussed above is made in reference to media content stored locally on DVD/Blu-ray type devices, the scope of the invention is not limited only to locally distributed media.
  • the user can purchase entertainment products, such as movie, music etc., in downloadable digital format from the service provider. Once the entertainment product is downloaded, it can be locally stored either on the secondary storage (e.g internal hard disc) or on a writable DVD/Blue-ray. Alternatively, the entertainment product can be streamed from the service provider or a third-party service provider.
  • the authoring and playback system consists of three components: (1) content management system/software; (2) composer tool, and (3) playback tool. Each component is independently implemented, although, the components can be part of one interactive system having the capabilities of all three components.
  • This example illustrates the design and implementation of the first component of the authoring and playback system - the interactive content management system (CMS) 100 for creation of one or more items (video embedded entities) to be reproduced in a designated video stream (see Fig. 5).
  • CMS: interactive content management system
  • the user creates an account, preferably over the web as illustrated in the screen capture 111.
  • the user can create a project as shown in the screen capture 112.
  • the user has an option to name the project, provide its status (e.g., pending or processed) and category.
  • a video file of any format can be readily uploaded in Step 102.
  • the preferred video format is H.264/MPEG-4 Part 10 (Advanced Video Coding).
  • each item includes, but is not limited to: (a) metadata information, (b) media information, (c) catalog information, (d) images, (e) secondary videos, and (f) other relevant information.
  • the metadata information may include the name of the item, the type/purpose of the item (e.g., product, service, etc.), related projects, images, a description in either plain or formatted text (e.g., HTML encoded), JavaScript, a merchandise icon, a source, a URL (uniform resource locator; also known as a web address), and a geotag.
  • the information is stored in an SQL-type database for future use and manipulation by the user.
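A minimal sketch of such SQL-type storage, using SQLite for illustration: the column set mirrors the metadata fields listed above, but the schema itself, the table name, and the sample row are assumptions, not part of the patent.

```python
# Hypothetical item table for the interactive CMS; schema and values are
# illustrative assumptions based on the metadata fields named in the text.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE item (
        id          INTEGER PRIMARY KEY,
        name        TEXT NOT NULL,
        item_type   TEXT,   -- e.g., 'product' or 'service'
        project     TEXT,   -- related project
        description TEXT,   -- plain or HTML-encoded text
        url         TEXT,   -- uniform resource locator
        geotag      TEXT
    )
""")
conn.execute(
    "INSERT INTO item (name, item_type, project, url) VALUES (?, ?, ?, ?)",
    ("Spidey T-Shirt", "product", "Spiderman 2", "https://example.com/spidey"),
)
row = conn.execute("SELECT name, item_type FROM item").fetchone()
```

Storing items in a relational table like this is what lets the composer and playback tools later query them by project and attach them to video frames.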
  • FIG. 6 illustrates the relationship between the composer tool 200 and the interactive CMS 100.
  • the composer tool 200 is preferably a standalone application that allows users to attach items generated by the CMS 100 to objects inside a particular video stream/file via a process called tagging. In one exemplary embodiment, all processed objects are available for editing in the composer tool 200.
  • a project is selected with one or more available items. After processing, the video associated with the project contains a plurality of key frames that can be displayed and reviewed in step 202 at a frame rate of 0.1, 0.5, 1, 2, 5, 10, 24, or 60 frames/second, or any frame rate therebetween.
  • the key frames can be displayed based on the "scene" within the video.
  • a hotspot is selected and additional parameters are added that correlate the item to the object of interest.
  • the additional parameters may include XY coordinates that define the location of the object in the frame and a t coordinate that defines the time stamp of when the object first appears in the video.
  • the video is advanced to the next frame in time. Since the object may move, in order to track the object the XY coordinates of the hotspot are adjusted at a new t timestamp. A similar approach is taken with 3D video content, with an additional coordinate axis in Z. Thus, for the duration the object appears on the screen, the hotspot tracks the object as a function of steps.
  • the location of the hotspot in the remaining frames can be automatically interpolated by the composer tool 200. That is, the new hotspots within the range of a discrete set of known hotspots (steps) are reconstructed based on curve fitting (linear, polynomial, spline, Gaussian) or regression analysis.
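The interpolation step can be sketched as follows. Only the linear case is shown; the patent also names polynomial, spline, and Gaussian fitting and regression analysis. The function name and keyframe format are assumptions introduced for illustration.

```python
# Hypothetical sketch: reconstructing hotspot positions between manually
# tagged keyframes (steps) by linear interpolation on the t axis.

def interpolate_hotspot(keyframes, t):
    """keyframes: time-sorted list of (t, x, y) tuples; returns (x, y) at time t."""
    for (t0, x0, y0), (t1, x1, y1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)  # fraction of the way between the two steps
            return (x0 + f * (x1 - x0), y0 + f * (y1 - y0))
    raise ValueError("t is outside the tagged range")

# Two manually placed steps: the object moves from (100, 50) to (200, 150)
# over two seconds of video.
keys = [(0.0, 100, 50), (2.0, 200, 150)]
pos = interpolate_hotspot(keys, 1.0)  # position halfway between the steps
```

For 3D content, the same scheme extends naturally with a Z component in each keyframe tuple.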
  • This example illustrates the design and implementation of the third component of the authoring and playback system - a playback (player) tool 300 as shown in Figs. 7A-7B.
  • the player tool 300 is preferably a standalone application that allows users to view the videos and interact with the overlaid information prepared with the CMS 100 and the composer 200 during the playback.
  • the user is provided with a library 310 of the overlaid files that can be played in the player tool 300.
  • a video is selected.
  • a short introduction about the video can be displayed before the playback is initiated (see Screen Capture 311).
  • the playback can start automatically after a predefined pause (e.g., 2-10 seconds) or the user can initiate the playback manually by pressing "PLAY" (see Screen Captures 311).
  • the graphical video overlay, which visually shows the viewer/user the interactive areas of the video, can be revealed automatically or by an action of the user (e.g., touching the touch-sensitive screen).
  • the user can select a tag (displayed icon) associated with an object of interest in step 303. By selecting the icon (tag), an item is called and, in one exemplary embodiment, a compact gallery can be triggered for an actor.
  • additional information can be displayed (312 & 313).
  • the information can be linked via URL to an outside source that can be displayed by the player using its internal web browser.
  • the displayed information can be shared using one of the available social media platforms (e.g., Facebook, Twitter, etc.).
  • This example illustrates the design and implementation of the authoring and playback system in a screenplay product placement.
  • the system allows screenwriters to tag objects, regions and themes in the screenplay, then submit the screenplay to a web-based platform, which allows advertisers/interested parties to bid on product placement related to tagged objects, regions and themes in the screenplay.
  • Screenwriters are allowed to highlight words or paragraphs describing objects, regions and themes in their screenplay. These highlighted words can then be related to Items created in the CMS system, which then describe them.
  • Advertisers are allowed to view/search lists of items and their screenplay context, categorized by keywords. They can then select an item on which they can bid for product placement. Other advertisers are allowed to bid on the same item. Once items are purchased, Advertisers use the CMS system to add metadata such as text, images, video and other media to the Item purchased.
  • the composer tool for tag authoring is used to tag the item in the video frames and to allow viewers to display the metadata related to that object, region or theme in the video frame which is produced with and for the advertiser who owns the product placement item.
  • this example illustrates, as shown in FIG. 3, the structure of a sample metadata file written in the XML language.
  • the metadata file is entitled metadata_spiderman2.xml.
  • the file is designed for use with the movie Spiderman 2, released in 2004 in DVD format.
  • the metadata file identifies the target video content with a unique id number.
  • the metadata file contains the advertiser's name, contact information, and the product to be sold. In this particular example, it is a Mattel® Toys Spidey T-Shirt.
  • the metadata file ends with the seller location.
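Since the patent does not reproduce the XML schema of FIG. 3, the element and attribute names below are illustrative assumptions; the sketch only shows how a file carrying the fields described above (target video id, advertiser, product, seller location) might look and be parsed:

```python
# Hypothetical sketch of a metadata file like metadata_spiderman2.xml.
# Element/attribute names and the contact address are assumptions; the
# patent only lists the kinds of information the file carries.
import xml.etree.ElementTree as ET

sample = """
<metadata videoId="SPIDERMAN2-2004-DVD">
  <advertiser>
    <name>Mattel Toys</name>
    <contact>ads@example.com</contact>
  </advertiser>
  <product>Spidey T-Shirt</product>
  <sellerLocation>El Segundo, CA</sellerLocation>
</metadata>
"""

root = ET.fromstring(sample)
video_id = root.get("videoId")      # unique id of the target video content
product = root.findtext("product")  # the product to be sold
```

The unique video id is what lets the playback system match a downloaded metadata file to the disc or stream currently being reproduced.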

Landscapes

  • Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • Accounting & Taxation (AREA)
  • Development Economics (AREA)
  • Strategic Management (AREA)
  • Finance (AREA)
  • Game Theory and Decision Science (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

This invention relates generally to a content integration and delivery system and to a method of using the same. In particular, the present invention relates to a system that integrates digital video content with object-oriented script (hotspots) to provide object-specific marketing, including potential channels of distribution, such as internet shopping. The system includes the process of identifying and marking the relative location of specific objects within digital media content and providing an overlay allowing users to locate the object over the global system of interconnected computer networks (e.g., the Internet). Using the described interactive platform, a user can easily view, display, select, and purchase any product appearing in the video via the Internet.
PCT/US2014/027325 2013-03-15 2014-03-14 Système de mercatique interactive WO2014152422A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13/837,915 US9277157B2 (en) 2004-09-01 2013-03-15 Interactive marketing system
US13/837,915 2013-03-15

Publications (1)

Publication Number Publication Date
WO2014152422A1 true WO2014152422A1 (fr) 2014-09-25

Family

ID=51581213

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/027325 WO2014152422A1 (fr) 2013-03-15 2014-03-14 Système de mercatique interactive

Country Status (1)

Country Link
WO (1) WO2014152422A1 (fr)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070260677A1 (en) * 2006-03-17 2007-11-08 Viddler, Inc. Methods and systems for displaying videos with overlays and tags
US20100010893A1 (en) * 2008-07-09 2010-01-14 Google Inc. Video overlay advertisement creator
US20110112914A1 (en) * 2009-06-04 2011-05-12 Viacom International, Inc. Dynamic integration and linear presentation of advertising content and media content
US20120308206A1 (en) * 2007-10-07 2012-12-06 Fall Front Wireless Ny, Llc Digital network-based video tagging with tag filtering


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10878138B2 (en) 2017-02-23 2020-12-29 Mitek Holdings, Inc. Method of managing proxy objects
US11314903B2 (en) 2017-02-23 2022-04-26 Mitek Holdings, Inc. Method of managing proxy objects
US11687684B2 (en) 2017-02-23 2023-06-27 Mitek Holdings, Inc. Method of managing proxy objects
US12079545B2 (en) 2017-02-23 2024-09-03 Mitek Holdings, Inc. Method of managing proxy objects

Similar Documents

Publication Publication Date Title
US9277157B2 (en) Interactive marketing system
US11432033B2 (en) Interactive video distribution system and video player utilizing a client server architecture
JP6803427B2 (ja) コンテンツトランザクションアイテムの動的バインド
US9912994B2 (en) Interactive distributed multimedia system
US10872193B2 (en) Method for publishing composite media content and publishing system to perform the method
US10506278B2 (en) Interactive video distribution system and video player utilizing a client server architecture
US8412021B2 (en) Video player user interface
US9888289B2 (en) Liquid overlay for video content
US20150095782A1 (en) System and methods for providing user generated video reviews
US20140282743A1 (en) Shoppable video
US9171137B2 (en) Systems and methods for enabling an automatic license for mashups
US9203917B1 (en) Tracking user affinity through interactions with media files
US10042516B2 (en) Lithe clip survey facilitation systems and methods
US20180249206A1 (en) Systems and methods for providing interactive video presentations
US20170024097A1 (en) Method and Host Server for Creating a Composite Media File
WO2014152422A1 (fr) Système de mercatique interactive
US20130262200A1 (en) Method and Apparatus for the identification of products displayed in media programs
EP3301630A1 (fr) Procédé de publication de contenu multimédia composite et système de publication permettant d'exécuter le procédé

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14768773

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14768773

Country of ref document: EP

Kind code of ref document: A1