
CN111144914B - System, method and apparatus for outputting information - Google Patents

System, method and apparatus for outputting information

Info

Publication number
CN111144914B
Authority
CN
China
Prior art keywords
article
information
item
position change
identification
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811306145.9A
Other languages
Chinese (zh)
Other versions
CN111144914A (en)
Inventor
侯晓宁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jingdong Technology Holding Co Ltd
Original Assignee
Jingdong Technology Holding Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jingdong Technology Holding Co Ltd filed Critical Jingdong Technology Holding Co Ltd
Priority to CN201811306145.9A priority Critical patent/CN111144914B/en
Publication of CN111144914A publication Critical patent/CN111144914A/en
Application granted granted Critical
Publication of CN111144914B publication Critical patent/CN111144914B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0281Customer communication at a business location, e.g. providing product or service information, consulting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Development Economics (AREA)
  • Finance (AREA)
  • Physics & Mathematics (AREA)
  • Strategic Management (AREA)
  • Accounting & Taxation (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Game Theory and Decision Science (AREA)
  • Human Computer Interaction (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present application disclose a system, a method and an apparatus for outputting information. One embodiment of the system comprises: a sensor configured to send an item identification of an item to be displayed to a control terminal in response to detecting that the item is moved or blocked; the control terminal, configured to, in response to receiving the item identification sent by the sensor, search for item information corresponding to the item identification in a preset item information base and output the item information to a display; and the display, configured to display the item information. This embodiment enables item information to be output according to the user's needs.

Description

System, method and apparatus for outputting information
Technical Field
The embodiments of the present application relate to the field of computer technology, and in particular to a system, a method and an apparatus for outputting information.
Background
In supermarkets and stores, the existing way of presenting commodities is mainly a large video screen that plays introduction videos of promoted commodities in a loop, but this approach has several drawbacks. For example, the order in which the introduction videos are played on the large screen is fixed and cannot jump to the commodity a customer is actually interested in according to the customer's needs; the presentation is monotonous, customer participation is low, customers are not attracted to stay for long, and the customer's experience of the commodities is correspondingly poor.
Disclosure of Invention
The embodiment of the application provides a system, a method and a device for outputting information.
In a first aspect, an embodiment of the present application provides a system for outputting information, including: a sensor configured to send an item identification of an item to be displayed to a control terminal in response to detecting that the item is moved or blocked; the control terminal, configured to, in response to receiving the item identification sent by the sensor, search for item information corresponding to the item identification in a preset item information base and output the item information to a display; and the display, configured to display the item information.
In some embodiments, the sensor is further configured to send position change information of the item to the control terminal in response to detecting a position change of the item; and the control terminal is further configured to: in response to receiving the item identification and the position change information sent by the sensor, search for the item information corresponding to the item identification in the preset item information base, and output the item information to the display according to the position change information.
In some embodiments, the item information includes at least one of: text information, video information, image information, and naked eye 3D models.
In some embodiments, the system further comprises: the naked eye 3D player is configured to play a naked eye 3D model of the object.
In some embodiments, the sensor comprises at least one of: a proximity sensor configured to send the item identification of the item to the control terminal in response to detecting that the item is blocked for longer than a predetermined time threshold; and a touch sensor configured to send the item identification and position change information of the item to the control terminal in response to detecting that the item is moved.
In some embodiments, the control terminal is further configured to send the output state of the item information to the sensor.
In some embodiments, the sensor further comprises an LED lamp group configured to, in response to receiving the output state, light LED lamps according to the output state.
In a second aspect, an embodiment of the present application provides a method for outputting information, including: receiving article identification and position change information of an article to be displayed; searching article information corresponding to the article identifier from a preset article information base; and outputting the article information according to the position change information.
In some embodiments, the method further comprises: and transmitting the output state of the article information.
In a third aspect, an embodiment of the present application provides a method for outputting information, including: detecting whether the time for which an item to be displayed is blocked exceeds a predetermined time threshold; if yes, sending an item identification of the item; detecting whether the position information of the item changes; and if a change occurs, sending the item identification and position change information of the item.
In some embodiments, the method further comprises: and in response to receiving the output state of the article information, illuminating the LED lamp according to the output state.
In a fourth aspect, an embodiment of the present application provides an apparatus for outputting information, including: a receiving unit configured to receive article identification and position change information of an article to be displayed; the searching unit is configured to search article information corresponding to the article identifier from a preset article information base; and an output unit configured to output the item information according to the position change information.
In a fifth aspect, an embodiment of the present application provides an apparatus for outputting information, including: a proximity sensing unit configured to detect whether a time for which an item to be displayed is blocked exceeds a predetermined time threshold; a first transmitting unit configured to transmit an item identification of the item if the item is blocked beyond a predetermined time threshold; a touch sensing unit configured to detect whether or not position information of an article is changed; and a second transmitting unit configured to transmit the article identification and the position change information of the article if the position information of the article is changed.
In a sixth aspect, an embodiment of the present application provides an electronic device, including: one or more processors; and a storage device having one or more programs stored thereon, which when executed by the one or more processors cause the one or more processors to implement the method as one of the second or third aspects.
In a seventh aspect, an embodiment of the present application provides a computer readable medium having stored thereon a computer program, wherein the program when executed by a processor implements a method as in one of the second or third aspects.
The system, method and apparatus for outputting information provided by the embodiments of the present application trigger the display to output the item information of an item by detecting whether the item is moved or blocked, so that information about the item a user is interested in can be displayed according to the user's needs.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the detailed description of non-limiting embodiments, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which an embodiment of the present application may be applied;
FIG. 2 is a schematic diagram of the hardware configuration of the sensor of the present application;
FIG. 3 is a flow chart of one embodiment of a method for outputting information in accordance with the present application;
FIG. 4 is a flow chart of yet another embodiment of a method for outputting information in accordance with the present application;
FIG. 5 is a schematic diagram of an application scenario of a method for outputting information according to the present application;
FIG. 6 is a schematic diagram of an embodiment of an apparatus for outputting information in accordance with the present application;
FIG. 7 is a schematic structural view of still another embodiment of an apparatus for outputting information according to the present application;
FIG. 8 is a schematic diagram of a computer system suitable for use in implementing an embodiment of the application.
Detailed Description
The application is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting of the application. It should be noted that, for convenience of description, only the portions related to the present application are shown in the drawings.
It should be noted that, without conflict, the embodiments of the present application and features of the embodiments may be combined with each other. The application will be described in detail below with reference to the drawings in connection with embodiments.
Fig. 1 illustrates an exemplary system architecture 100 for a system for outputting information to which the present application may be applied.
As shown in fig. 1, in this embodiment, the system includes:
The sensor 101 is configured to send an item identification of the item to the control terminal in response to detecting that the item to be displayed is moved or blocked. A sensor is mounted on each item to be displayed, and each sensor is bound to the item identification of its item. The sensor is communicatively connected to the control terminal in a wireless manner (e.g., Bluetooth, WiFi, etc.). A user staying near an item for a long time, or moving the item, indicates that the user is interested in that item. When the sensor detects that the item has been blocked for longer than a predetermined time threshold, it sends the item identification of the item to the control terminal, so that the control terminal outputs the item information of the item.
The control terminal 102 is configured to, in response to receiving the item identification sent by the sensor, search for the item information corresponding to the item identification in a preset item information base and output the item information to the display. The item information base stores the correspondence between item information and item identifications for various items. The item information may include at least one of: text information, video information, image information, and a naked eye 3D model. The item information may be indexed by item identification.
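For illustration only — the embodiment does not prescribe any particular storage format — the item information base can be pictured as a mapping keyed by item identification. The following minimal Python sketch makes that concrete; the field names, keys and sample values are assumptions, not part of the patent:

# Hedged sketch of the preset item information base and its lookup; keys, field
# names and file paths are hypothetical examples.
ITEM_INFO_BASE = {
    "item-001": {
        "text": "Stainless steel electric kettle, 1.7 L, 1800 W.",
        "video": "videos/item-001_intro.mp4",
        "images": ["images/item-001_front.jpg", "images/item-001_bottom.jpg"],
        "naked_eye_3d_model": "models/item-001.glb",
    },
}

def lookup_item_info(item_id: str) -> dict | None:
    """Search the preset item information base by item identification."""
    return ITEM_INFO_BASE.get(item_id)

print(lookup_item_info("item-001")["text"])  # text introduction to output to the display
print(lookup_item_info("item-999"))          # None: identification not in the base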
A display 103 configured to display the item information. The display may be a liquid crystal display or a CRT, or a display capable of presenting 3D images.
In some optional implementations of the present embodiment, the system may further include a naked eye 3D player 104 configured to play a naked eye 3D model of the item.
As shown in fig. 1, the core parts of the system are the control terminal and the sensor. The control terminal is the main control unit of the whole system; it is equipped with a WiFi communication unit, a Bluetooth receiving unit, an HDMI audio/video output unit, a power supply unit and the like, and is responsible for acquiring, parsing and sending device messages. The sensor may include a proximity sensor and a touch sensor, or only one of the two. The proximity sensor may include an infrared sensing unit, a WiFi communication unit, a power supply unit and the like. The touch sensor may include a Bluetooth unit, a motion detection unit, a power supply unit and the like. The structure of this intelligent hardware display system is shown in fig. 2.
The main control unit in fig. 2 integrates the WiFi and Bluetooth modules that serve as the communication channels between the touch sensor, the proximity sensor and the control terminal. The proximity sensor is an infrared ranging module: it judges whether a user is near the item by capturing the returned distance reading in real time, and starts timing if the user is within a predetermined distance range. Meanwhile, the number of lit LED lamps may increase as the timer runs. When the timed duration exceeds a set value, the item identification of the item bound to the sensor is sent to the control terminal through the WiFi module. The touch sensor may have a built-in acceleration sensor and Bluetooth module; the acceleration sensor continuously acquires XYZ three-axis data and, by fusing and comparing the readings, derives state information indicating whether the item has been moved. This state information is broadcast to the main control unit through the Bluetooth module, so that the trigger state of the sensor can be judged. If the main control unit determines that the item has been moved, the item identification is sent to the control terminal.
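As a rough, non-authoritative sketch of the proximity branch just described (infrared ranging, a timer whose progress is mirrored by the LED lamps, and a WiFi send once the set value is exceeded), the Python below uses assumed threshold values and stand-in functions in place of the real infrared, LED and WiFi hardware:

import time

ITEM_ID = "item-001"        # item identification bound to this sensor (assumed)
TRIGGER_DISTANCE_CM = 30.0  # "user is near the item" range (assumed)
TIME_THRESHOLD_S = 3.0      # predetermined time threshold (assumed)
LED_COUNT = 4               # number of LEDs lit progressively while timing

def read_infrared_distance_cm() -> float:
    return 100.0  # stand-in for the infrared ranging module reading

def set_lit_leds(n: int) -> None:
    print(f"[LED group] {n} lamp(s) lit")  # stand-in for the LED lamp group

def send_item_id_over_wifi(item_id: str) -> None:
    print(f"[WiFi] sent item identification {item_id}")  # stand-in for the WiFi module

def proximity_loop() -> None:
    blocked_since = None
    while True:
        if read_infrared_distance_cm() <= TRIGGER_DISTANCE_CM:
            if blocked_since is None:
                blocked_since = time.monotonic()  # user entered range: start timing
            elapsed = time.monotonic() - blocked_since
            # More LEDs light up as the timer approaches the threshold.
            set_lit_leds(min(LED_COUNT, int(elapsed / TIME_THRESHOLD_S * LED_COUNT)))
            if elapsed >= TIME_THRESHOLD_S:
                send_item_id_over_wifi(ITEM_ID)   # threshold exceeded: notify control terminal
                blocked_since = None
                set_lit_leds(0)
        else:
            blocked_since = None                  # user left range: reset timer and LEDs
            set_lit_leds(0)
        time.sleep(0.05)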
In some optional implementations of this embodiment, the sensor is further configured to send position change information of the item to the control terminal in response to detecting a position change of the item, and the control terminal is further configured to: in response to receiving the item identification and the position change information sent by the sensor, search for the item information corresponding to the item identification in the preset item information base and output the item information to the display according to the position change information. The item information may change in real time with the position of the item. For example, when the sensor detects that the item has been turned bottom-up, it sends the position change information to the control terminal. After the control terminal has found the information of the item, it outputs the information related to the bottom of the item, for example a detailed text introduction of the bottom and an image or video of the bottom. The naked eye 3D model may also be adjusted synchronously according to the position change information as the actual position of the item changes, and then sent to the naked eye 3D player for playing.
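The following sketch, again with hypothetical message fields (the "orientation" label and rotation angles are assumptions; the embodiment only states that the output follows the item's position), shows one way the control terminal could map position change information to the media that are output and to the adjustment of the naked eye 3D model:

# Hedged sketch: per-face content keyed by an assumed orientation label.
ITEM_FACE_CONTENT = {
    "item-001": {
        "upright":   {"text": "Front view introduction.", "image": "images/item-001_front.jpg"},
        "bottom_up": {"text": "Bottom details: rated power, serial number.",
                      "image": "images/item-001_bottom.jpg"},
    },
}

def output_for_position_change(item_id: str, position_change: dict) -> None:
    orientation = position_change.get("orientation", "upright")
    content = ITEM_FACE_CONTENT.get(item_id, {}).get(orientation)
    if content:
        print(f"[display] {content['text']} ({content['image']})")
    # The naked eye 3D model is kept in step with the physical item, e.g. by applying
    # the reported rotation before handing the model to the naked eye 3D player.
    rotation = position_change.get("rotation_deg", (0.0, 0.0, 0.0))
    print(f"[3D player] rotate model of {item_id} by {rotation} degrees and play")

# Example: the sensor reports that the item has been turned bottom-up.
output_for_position_change("item-001", {"orientation": "bottom_up",
                                        "rotation_deg": (180.0, 0.0, 0.0)})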
In some optional implementations of this embodiment, the control terminal is further configured to send the output state of the item information to the sensor, and the sensor further comprises an LED lamp group configured to, in response to receiving the output state, light LED lamps according to the output state. The output state may be playing, waiting to play, etc., and may be represented by LED lamps of different colors.
Each unit may be designed in an integrated, optimized way to reduce both volume and power consumption. The main control unit continuously monitors the output state returned via the WiFi and Bluetooth modules, judges and processes the information internally, and then pushes the item identification to the control terminal, which controls the output of the item information. The control terminal stores prefabricated 3D models of the items; by parsing the received WiFi command data packet it obtains the playing instruction to be executed, supports special effects such as pausing playback and switching animations, and drives the item display on the 3D playing terminal. The control terminal also outputs the commodity introduction information to the large display screen through its HDMI output. In this mode, which item a customer has picked up can be determined automatically, and the introduction video of that commodity can be played according to its tilt angle.
Optionally, when the item sits on the display stand and the touch sensor does not detect that the item is moved, the Bluetooth unit stays in a dormant state to save battery power.
Optionally, when the sensor of the corresponding item is triggered, the item identification is sent to the control terminal through the configured Bluetooth or WiFi module.
Optionally, when the control terminal receives an item identification, it compares it with the item identifications stored in the control terminal in advance, finds the item with the matching identification, and then sends an output command for the corresponding item information to the display so that text, images or video are shown. It may also send an output command for the corresponding item information to the naked eye 3D playing terminal to display the 3D model, while outputting the detailed parameter information of the item to the display screen.
With continued reference to fig. 3, a flow 300 of one embodiment of a method for outputting information in accordance with the present application is shown. The method for outputting information comprises the following steps:
Step 301, detecting whether the time for which the item to be displayed is blocked exceeds a predetermined time threshold.
In this embodiment, the execution body of the method for outputting information (e.g., the sensor shown in fig. 1) may detect, by means of a built-in proximity sensor, whether the time for which the item to be displayed is blocked exceeds a predetermined time threshold. Infrared rays are used to detect whether an obstacle exists within a preset range near the item; if an obstacle exists, the item is considered to be blocked. If the blocked time exceeds the predetermined time threshold, sending a message to the control terminal is triggered.
Step 302, if yes, send the item identification of the item.
In this embodiment, a sensor is mounted on each item to be displayed, so each sensor is bound to an item identification. When the sensor detects that a trigger condition is met (the item is blocked for longer than a predetermined time threshold, or is moved), the item identification of the item is sent to the control terminal.
Step 303, it is detected whether the position information of the item has changed.
In this embodiment, whether the item to be displayed has been moved may be detected by a built-in touch sensor. The acceleration sensor in the touch sensor may acquire XYZ three-axis data to determine whether the position of the item has changed, i.e., whether it is moving. The movement may be purely translational or may include rotation.
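A minimal sketch of this decision, assuming the "moved" judgement is made by comparing two successive acceleration samples against a threshold (the threshold value and the sample format are assumptions; a real implementation could equally fuse longer windows of data):

import math

MOVE_THRESHOLD_G = 0.15  # assumed change, in g, treated as "the item was moved"

def has_moved(prev_xyz: tuple[float, float, float],
              curr_xyz: tuple[float, float, float],
              threshold: float = MOVE_THRESHOLD_G) -> bool:
    """Compare successive XYZ accelerometer samples; translation and rotation both
    change the sensed acceleration vector, so a large enough change means movement."""
    return math.dist(prev_xyz, curr_xyz) > threshold

# Example: a resting reading versus a reading taken while the item is picked up.
print(has_moved((0.01, 0.02, 0.98), (0.01, 0.02, 0.99)))  # False: noise only
print(has_moved((0.01, 0.02, 0.98), (0.40, 0.10, 0.70)))  # True: item moved or tilted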
Step 304, if a change has occurred, send the item identification and the position change information of the item.
In this embodiment, if the position of the item has changed, this indicates that the user has picked up the item and is interested in it, so the control terminal needs to be notified of the item identification of the item so that it can output the item information of the item. In addition, in order to display the item from multiple angles, the position change information may also be sent to the control terminal, so that the control terminal can select the viewing angle of the output item images and videos according to the current orientation of the item. For example, if the item is flipped over, an image, video or text of the bottom of the item, or details of the bottom, is output.
In some optional implementations of this embodiment, the method further includes: in response to receiving the output state of the item information, lighting LED lamps according to the output state. When outputting the item information, the control terminal may feed back an output state to the sensor, e.g., being output, queued for output, etc. If there is no feedback, indicating that the control terminal did not receive the command carrying the item identification that requests output of the item information, the sensor may retransmit the item identification.
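A small sketch of this feedback loop from the sensor's point of view; the state names, LED colours, retry count and timeout are all assumptions, and the transport functions are passed in as stand-ins rather than a real WiFi/Bluetooth API:

STATE_TO_LED = {"playing": "green", "queued": "yellow"}  # assumed state-to-colour mapping

def light_led(colour: str) -> None:
    print(f"[LED] {colour}")  # stand-in for the LED lamp group

def send_and_confirm(item_id: str, send, wait_for_output_state,
                     retries: int = 3, timeout_s: float = 1.0) -> None:
    """Send the item identification; if no output state is fed back within the timeout,
    retransmit, since the control terminal may not have received the command."""
    for _ in range(retries):
        send(item_id)
        state = wait_for_output_state(timeout_s)
        if state is not None:
            light_led(STATE_TO_LED.get(state, "red"))
            return
    light_led("off")  # give up after the configured number of retries

# Example with a stubbed transport: the first attempt times out, the second is acknowledged.
responses = iter([None, "queued"])
send_and_confirm("item-001",
                 send=lambda item_id: print(f"[send] {item_id}"),
                 wait_for_output_state=lambda timeout: next(responses))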
With continued reference to fig. 4, a flow 400 of one embodiment of a method for outputting information in accordance with the present application is shown. The method for outputting information comprises the following steps:
Step 401, receiving an item identification and position change information of an item to be displayed.
In this embodiment, the execution body of the method for outputting information (e.g., the control terminal shown in fig. 1) may receive, over a wireless connection, the item identification and the position change information of the item to be displayed sent by the sensor. Alternatively, the sensor may send only the item identification without the position change information, in which case it may be assumed that no position change of the item has occurred. It should be noted that the wireless connection may include, but is not limited to, a 3G/4G connection, WiFi connection, Bluetooth connection, WiMAX connection, ZigBee connection, UWB (ultra wideband) connection, or other wireless connections now known or developed in the future.
Step 402, searching article information corresponding to the article identifier from a preset article information base.
In this embodiment, the article information base stores correspondence between article information and article identifiers of various articles. The item information may include at least one of: text information, video information, image information, and naked eye 3D models. The item information may be indexed by item identification.
Step 403, outputting the article information according to the position change information.
In this embodiment, the item information may change in real time with the position of the item. For example, when the sensor detects that the item has been turned bottom-up, it sends the position change information to the control terminal. After the control terminal has found the information of the item, it outputs the information related to the bottom of the item, for example a detailed text introduction of the bottom and an image or video of the bottom. The naked eye 3D model may also be adjusted synchronously according to the position change information and then sent to the naked eye 3D player for playing.
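Putting steps 401-403 together, the sketch below shows a control-terminal-side handler under the assumption that the sensor's message carries the item identification and, optionally, the position change information (absent means no change, as noted in step 401); the message and information-base formats are illustrative only:

def handle_sensor_message(message: dict, item_info_base: dict) -> None:
    item_id = message["item_id"]                      # step 401: receive the identification
    position_change = message.get("position_change")  # optional: absent means no change
    info = item_info_base.get(item_id)                # step 402: look up the item information
    if info is None:
        return  # unknown identification: nothing to output
    if position_change is None:                       # step 403: output the information
        print(f"[display] {info['text']}")
    else:
        face = position_change.get("orientation", "upright")
        print(f"[display] {info['text']} (showing '{face}' media)")
    if "naked_eye_3d_model" in info:
        print(f"[3D player] play {info['naked_eye_3d_model']}")

# Example usage with a tiny in-memory information base.
base = {"item-001": {"text": "Electric kettle introduction.",
                     "naked_eye_3d_model": "models/item-001.glb"}}
handle_sensor_message({"item_id": "item-001"}, base)
handle_sensor_message({"item_id": "item-001",
                       "position_change": {"orientation": "bottom_up"}}, base)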
With continued reference to fig. 5, fig. 5 is a schematic diagram of an application scenario of the method for outputting information according to the present embodiment. In the application scenario of fig. 5, the sensor performs step 501 to detect that the item to be displayed is moved or blocked, and then performs step 502 to send the item identification of the item to the control terminal. The control terminal performs step 503 to find the item information corresponding to the item identifier. The control terminal then performs step 504 to output the item information to the display. The display performs step 505 to display item information. If the naked eye 3D model is included in the article information, the naked eye 3D model can be sent to a naked eye 3D player for display.
According to the method provided by this embodiment of the application, the state change of the item detected by the sensor is transmitted to the control terminal, which is thereby triggered to look up the item information and output it. The method achieves on-demand output of item information for the user and improves the advertising effect.
With further reference to fig. 6, as an implementation of the method shown in the above figures, the present application provides an embodiment of an apparatus for outputting information, which corresponds to the method embodiment shown in fig. 3, and which is particularly applicable to various electronic devices.
As shown in fig. 6, the apparatus 600 for outputting information of the present embodiment includes: a proximity sensing unit 601, a first transmitting unit 602, a touch sensing unit 603, and a second transmitting unit 604. Wherein the proximity sensing unit 601 is configured to detect whether a time when an item to be displayed is blocked exceeds a predetermined time threshold; the first sending unit 602 is configured to send an item identification of the item if the item is blocked beyond a predetermined time threshold; the touch sensing unit 603 is configured to detect whether or not the positional information of the article is changed; the second transmitting unit 604 is configured to transmit the item identification and the position change information of the item if the position information of the item is changed.
In this embodiment, specific processes of the proximity sensing unit 601, the first transmitting unit 602, the touch sensing unit 603, and the second transmitting unit 604 of the apparatus 600 for outputting information may refer to steps 301, 302, 303, 304 in the corresponding embodiment of fig. 3.
With further reference to fig. 7, as an implementation of the method shown in the above figures, the present application provides an embodiment of an apparatus for outputting information, which corresponds to the method embodiment shown in fig. 4, and which is particularly applicable to various electronic devices.
As shown in fig. 7, the apparatus 700 for outputting information of the present embodiment includes: a receiving unit 701, a searching unit 702 and an output unit 703. Wherein the receiving unit 701 is configured to receive item identification and location change information of an item to be displayed; the searching unit 702 is configured to search for item information corresponding to the item identifier from a preset item information base; the output unit 703 is configured to output item information according to the position change information.
Referring now to FIG. 8, there is illustrated a schematic diagram of a computer system 800 suitable for use in implementing an electronic device (e.g., the control terminal shown in FIG. 1) in accordance with an embodiment of the present application. The electronic device shown in fig. 8 is only an example and should not be construed as limiting the functionality and scope of use of the embodiments of the application.
As shown in fig. 8, the computer system 800 includes a Central Processing Unit (CPU) 801 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 802 or a program loaded from a storage section 808 into a Random Access Memory (RAM) 803. In the RAM 803, various programs and data required for the operation of the system 800 are also stored. The CPU 801, ROM 802, and RAM 803 are connected to each other by a bus 804. An input/output (I/O) interface 805 is also connected to the bus 804.
The following components are connected to the I/O interface 805: an input portion 806 including a keyboard, mouse, etc.; an output portion 807 including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and a speaker; a storage section 808 including a hard disk or the like; and a communication section 809 including a network interface card such as a LAN card, a modem, or the like. The communication section 809 performs communication processing via a network such as the internet. The drive 810 is also connected to the I/O interface 805 as needed. A removable medium 811 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 810 as needed so that a computer program read out therefrom is mounted into the storage section 808 as needed.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network via the communication section 809, and/or installed from the removable media 811. The above-described functions defined in the method of the present application are performed when the computer program is executed by a Central Processing Unit (CPU) 801. The computer readable medium according to the present application may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present application, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk or C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present application may be implemented in software or in hardware. The described units may also be provided in a processor, for example, described as: a processor includes a receiving unit, a lookup unit, and an output unit. The names of these units do not constitute a limitation on the unit itself in some cases, and for example, the receiving unit may also be described as "a unit that receives item identification and position change information of an item to be displayed".
As another aspect, the present application also provides a computer-readable medium that may be contained in the apparatus described in the above embodiments; or may be present alone without being fitted into the device. The computer readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to receive item identification and location change information for an item to be displayed; searching article information corresponding to the article identifier from a preset article information base; and outputting the article information according to the position change information.
The above description is only illustrative of the preferred embodiments of the present application and of the principles of the technology employed. It will be appreciated by persons skilled in the art that the scope of the invention referred to in the present application is not limited to the specific combinations of the technical features described above, but also covers other technical solutions formed by any combination of the above technical features or their equivalents without departing from the inventive concept, for example technical solutions in which the above features are replaced with technical features having similar functions disclosed in (but not limited to) the present application.

Claims (14)

1. A system for outputting information, comprising:
a sensor, configured to send an article identification of an article to be displayed to a control terminal in response to detecting that the article is moved or blocked, and to send position change information of the article to the control terminal in response to detecting a position change of the article by continuously acquiring, fusing, comparing and analyzing XYZ three-axis data through an acceleration sensor;
a control terminal, configured to: in response to receiving the article identification sent by the sensor, search for article information corresponding to the article identification in a preset article information base and output the article information to a display; and, in response to receiving the article identification and the position change information sent by the sensor, search for the article information corresponding to the article identification in the preset article information base, output the article information to the display according to the position change information, synchronously adjust a naked eye 3D model according to the position change information as the actual position of the article changes and then send the naked eye 3D model to a naked eye 3D player for playing, and, if the position change information indicates that the article has been turned bottom-up, output information related to the bottom of the article; and
a display, configured to display the article information.
2. The system of claim 1, wherein the item information comprises at least one of: text information, video information, image information, and naked eye 3D models.
3. The system of claim 2, wherein the system further comprises:
And the naked eye 3D player is configured to play the naked eye 3D model of the article.
4. The system of claim 1, wherein the sensor comprises at least one of:
a proximity sensor configured to send the article identification of the article to the control terminal in response to detecting that the article is blocked for longer than a predetermined time threshold;
and a touch sensor configured to send the article identification and the position change information of the article to the control terminal in response to detecting that the article is moved.
5. The system of one of claims 1-4, wherein the control terminal is further configured to:
and sending the output state of the article information to the sensor.
6. The system of claim 5, wherein the sensor further comprises:
and the LED lamp group is configured to respond to the receiving of the output state and light the LED lamp according to the output state.
7. A method for outputting information, applied to the sensor in the system according to any one of claims 1-6, the method comprising:
detecting whether the time for which the article to be displayed is blocked exceeds a predetermined time threshold;
if yes, sending the article identification of the article;
detecting whether the position information of the article changes;
and if a change occurs, sending the article identification and the position change information of the article.
8. The method of claim 7, wherein the method further comprises:
and in response to receiving the output state of the article information, lighting the LED lamp according to the output state.
9. A method for outputting information, applied to a control terminal in the system of any one of claims 1-6, the method comprising:
Receiving article identification and position change information of an article to be displayed;
searching article information corresponding to the article identifier from a preset article information base;
and outputting the article information according to the position change information.
10. The method of claim 9, wherein the method further comprises:
and sending the output state of the article information.
11. An apparatus for outputting information, applied to the sensor in the system of any one of claims 1-6, the apparatus comprising:
a proximity sensing unit configured to detect whether the time for which an article to be displayed is blocked exceeds a predetermined time threshold;
a first transmitting unit configured to transmit the article identification of the article if the article is blocked beyond the predetermined time threshold;
a touch sensing unit configured to detect whether or not position information of the article is changed;
And the second sending unit is configured to send the article identification and the position change information of the article if the position information of the article changes.
12. An apparatus for outputting information, for use in a control terminal in a system according to any one of claims 1-6, the apparatus comprising:
a receiving unit configured to receive article identification and position change information of an article to be displayed;
The searching unit is configured to search article information corresponding to the article identifier from a preset article information base;
and an output unit configured to output the article information according to the position change information.
13. An electronic device, comprising:
One or more processors;
a storage device having one or more programs stored thereon,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 7-10.
14. A computer readable medium having stored thereon a computer program, wherein the program when executed by a processor implements the method according to one of claims 7-10.
CN201811306145.9A 2018-11-05 2018-11-05 System, method and apparatus for outputting information Active CN111144914B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811306145.9A CN111144914B (en) 2018-11-05 2018-11-05 System, method and apparatus for outputting information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811306145.9A CN111144914B (en) 2018-11-05 2018-11-05 System, method and apparatus for outputting information

Publications (2)

Publication Number Publication Date
CN111144914A CN111144914A (en) 2020-05-12
CN111144914B (en) 2024-06-18

Family

ID=70515163

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811306145.9A Active CN111144914B (en) 2018-11-05 2018-11-05 System, method and apparatus for outputting information

Country Status (1)

Country Link
CN (1) CN111144914B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107392737A (en) * 2017-08-28 2017-11-24 北京京东金融科技控股有限公司 The method and terminal of Item Information displaying

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101461186B1 (en) * 2011-07-07 2014-11-13 엘지디스플레이 주식회사 Stereoscopic image display device and driving method the same
CN104301708B (en) * 2014-10-22 2017-02-22 小米科技有限责任公司 3D display method, device and terminal
EP3088991B1 (en) * 2015-04-30 2019-12-25 TP Vision Holding B.V. Wearable device and method for enabling user interaction
US20170161769A1 (en) * 2015-12-03 2017-06-08 Capital One Services, Llc Methods and Systems for Generating Offers Based on Real-Time Data Collected By a Location-Detecting Network
CN105353512B (en) * 2015-12-10 2018-06-29 联想(北京)有限公司 A kind of method for displaying image and image display device
CN105718588A (en) * 2016-01-26 2016-06-29 北京行云时空科技有限公司 Space-time log automatic generation method and system based on 3D glasses
CN107391750B (en) * 2017-08-15 2020-09-25 北京百度网讯科技有限公司 Method and apparatus for processing information
CN107577807B (en) * 2017-09-26 2020-11-10 百度在线网络技术(北京)有限公司 Method and device for pushing information

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107392737A (en) * 2017-08-28 2017-11-24 北京京东金融科技控股有限公司 The method and terminal of Item Information displaying

Also Published As

Publication number Publication date
CN111144914A (en) 2020-05-12

Similar Documents

Publication Publication Date Title
EP3028008B1 (en) Methods and apparatus for determining the orientation of a mobile phone in an indoor environment
KR102447438B1 (en) Alarm device and method for informing location of objects thereof
US9196094B2 (en) Method and apparatus for augmented reality
US9473594B1 (en) Projection of interactive map data
US11720179B1 (en) System and method for redirecting content based on gestures
EP3427233B1 (en) Method and apparatus for providing augmented reality services
US9374159B2 (en) Reception display apparatus, information transmission apparatus, optical wireless communication system, reception display integrated circuit, information transmission integrated circuit, reception display program, information transmission program, and optical wireless communication method
US9549101B1 (en) Image capture enhancement using dynamic control image
US11170342B1 (en) Item identification and guidance system and method
US20190090295A1 (en) Mobile device and method for establishing a wireless link
CN109507904B (en) Household equipment management method, server and management system
CN104113742A (en) Image Display Unit, Mobile Phone And Method
CN111144914B (en) System, method and apparatus for outputting information
CN109166257B (en) Shopping cart commodity verification method and device thereof
US11657574B2 (en) Systems and methods for providing an audio-guided virtual reality tour
US20140002646A1 (en) Bottom of the basket surveillance system for shopping carts
US10963097B2 (en) Method, electronic device, and apparatus for touch-region calibration
CN110659848A (en) Method and system for monitoring object
US10345965B1 (en) Systems and methods for providing an interactive user interface using a film, visual projector, and infrared projector
US11861755B2 (en) Customer signaling location beacon
CN112333494B (en) Method and device for acquiring article information and electronic equipment
KR102516278B1 (en) User terminal for providing intuitive control environment on media pannel, server, and display apparatus
US20240331310A1 (en) Augmented reality guidance in a physical location
WO2022163651A1 (en) Information processing system
US12100194B1 (en) Image enhancement

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Room 221, 2nd floor, Block C, 18 Kechuang 11th Street, Daxing Economic and Technological Development Zone, Beijing, 100176

Applicant after: Jingdong Technology Holding Co.,Ltd.

Address before: Room 221, 2nd floor, Block C, 18 Kechuang 11th Street, Daxing Economic and Technological Development Zone, Beijing, 100176

Applicant before: Jingdong Digital Technology Holding Co.,Ltd.

Address after: Room 221, 2nd floor, Block C, 18 Kechuang 11th Street, Daxing Economic and Technological Development Zone, Beijing, 100176

Applicant after: Jingdong Digital Technology Holding Co.,Ltd.

Address before: Room 221, 2nd floor, Block C, 18 Kechuang 11th Street, Daxing Economic and Technological Development Zone, Beijing, 100176

Applicant before: JINGDONG DIGITAL TECHNOLOGY HOLDINGS Co.,Ltd.

GR01 Patent grant