Disclosure of Invention
Embodiments of the present application provide a system, a method, and an apparatus for outputting information.
In a first aspect, an embodiment of the present application provides a system for outputting information, including: a sensor configured to send an item identification of an item to be displayed to a control terminal in response to detecting that the item is moved or blocked; the control terminal, configured to, in response to receiving the item identification sent by the sensor, search for item information corresponding to the item identification in a preset item information base and output the item information to a display; and the display, configured to display the item information.
In some embodiments, the sensor is further configured to send position change information of the item to the control terminal in response to detecting a position change of the item; and the control terminal is further configured to, in response to receiving the item identification and the position change information sent by the sensor, search for the item information corresponding to the item identification in the preset item information base and output the item information to the display according to the position change information.
In some embodiments, the item information includes at least one of: text information, video information, image information, and a naked eye 3D model.
In some embodiments, the system further comprises a naked eye 3D player configured to play the naked eye 3D model of the item.
In some embodiments, the sensor comprises at least one of: a proximity sensor configured to send the item identification of the item to the control terminal in response to detecting that the item is blocked beyond a predetermined time threshold; and a touch sensor configured to transmit the item identification and the position change information of the item to the control terminal in response to detecting that the item is moved.
In some embodiments, the control terminal is further configured to send an output state of the item information to the sensor.
In some embodiments, the sensor further comprises an LED lamp group configured to, in response to receiving the output state, light LED lamps according to the output state.
In a second aspect, an embodiment of the present application provides a method for outputting information, including: receiving an item identification and position change information of an item to be displayed; searching for item information corresponding to the item identification in a preset item information base; and outputting the item information according to the position change information.
In some embodiments, the method further comprises: sending an output state of the item information.
In a third aspect, an embodiment of the present application provides a method for outputting information, including: detecting whether the time for which an item to be displayed is blocked exceeds a predetermined time threshold; if so, sending an item identification of the item; detecting whether position information of the item has changed; and if a change has occurred, sending the item identification and position change information of the item.
In some embodiments, the method further comprises: in response to receiving an output state of the item information, lighting LED lamps according to the output state.
In a fourth aspect, an embodiment of the present application provides an apparatus for outputting information, including: a receiving unit configured to receive an item identification and position change information of an item to be displayed; a searching unit configured to search for item information corresponding to the item identification in a preset item information base; and an output unit configured to output the item information according to the position change information.
In a fifth aspect, an embodiment of the present application provides an apparatus for outputting information, including: a proximity sensing unit configured to detect whether the time for which an item to be displayed is blocked exceeds a predetermined time threshold; a first transmitting unit configured to transmit an item identification of the item if the item is blocked beyond the predetermined time threshold; a touch sensing unit configured to detect whether position information of the item has changed; and a second transmitting unit configured to transmit the item identification and position change information of the item if the position information of the item has changed.
In a sixth aspect, an embodiment of the present application provides an electronic device, including: one or more processors; and a storage device having one or more programs stored thereon which, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of the second and third aspects.
In a seventh aspect, an embodiment of the present application provides a computer readable medium having a computer program stored thereon, wherein the program, when executed by a processor, implements the method according to any one of the second and third aspects.
The system, method, and apparatus for outputting information provided by the embodiments of the present application trigger the display to output the item information of an item by detecting whether the item is moved or blocked, so that information about the item the user is interested in can be displayed according to the user's needs.
Detailed Description
The application is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting of the application. It should be noted that, for convenience of description, only the portions related to the present application are shown in the drawings.
It should be noted that, without conflict, the embodiments of the present application and features of the embodiments may be combined with each other. The application will be described in detail below with reference to the drawings in connection with embodiments.
Fig. 1 illustrates an exemplary system architecture 100 for a system for outputting information to which the present application may be applied.
As shown in fig. 1, in this embodiment, the system includes:
The sensor 101 is configured to send an item identification of the item to the control terminal in response to detecting that the item to be displayed is moved or blocked. A sensor is mounted over each item to be displayed, and each sensor is bound to the item identification of its item. The sensor is communicatively connected to the control terminal in a wireless manner (e.g., Bluetooth, WIFI, etc.). A user lingering near an item for a longer time, or moving the item, indicates that the user is interested in that item. When the sensor detects that the item has been blocked beyond a predetermined time threshold, it sends the item identification of the item to the control terminal, so that the control terminal outputs the item information of the item.
The control terminal 102 is configured to, in response to receiving the item identification sent by the sensor, search for the item information corresponding to the item identification in a preset item information base and output the item information to the display. The item information base stores the correspondence between the item identifications and the item information of various items. The item information may include at least one of: text information, video information, image information, and a naked eye 3D model. The item information may be indexed by the item identification.
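As a non-limiting illustration only, the following Python sketch shows one possible form of such an item information base and of the lookup by item identification; the names (ItemInfo, ITEM_INFO_BASE, lookup_item_info), the field layout, and the sample contents are assumptions introduced here for illustration and are not part of the embodiments themselves.

    from dataclasses import dataclass
    from typing import Dict, Optional

    @dataclass
    class ItemInfo:
        text: Optional[str] = None        # text information
        image_path: Optional[str] = None  # image information
        video_path: Optional[str] = None  # video information
        model_path: Optional[str] = None  # naked eye 3D model file

    # Preset item information base: correspondence between item identifications
    # and item information (the contents here are placeholders).
    ITEM_INFO_BASE: Dict[str, ItemInfo] = {
        "item-001": ItemInfo(text="Sample introduction text",
                             image_path="item-001.png",
                             video_path="item-001.mp4",
                             model_path="item-001.glb"),
    }

    def lookup_item_info(item_id: str) -> Optional[ItemInfo]:
        # The item information is indexed by the item identification.
        return ITEM_INFO_BASE.get(item_id)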
The display 103 is configured to display the item information. The display may be a liquid crystal display or a CRT display, and may also be a display for presenting 3D images.
In some optional implementations of the present embodiment, the system may further include a naked eye 3D player 104 configured to play a naked eye 3D model of the item.
As shown in fig. 1, the core parts of the system are the control terminal and the sensor controller. The control terminal is the main control unit of the whole system; it is configured with a Wifi communication unit, a Bluetooth receiving unit, an HDMI audio/video output unit, a power supply unit, and the like, and is responsible for acquiring, parsing, and sending device messages. The sensor may include a proximity sensor and a touch sensor, or may include only one of them. The proximity sensor may include an infrared sensing unit, a Wifi communication unit, a power supply unit, and the like. The touch sensor may include a Bluetooth unit, a motion detection unit, a power supply unit, and the like. The structure of this intelligent hardware display system is shown in fig. 2.
The main control unit in fig. 2 integrates Wifi and Bluetooth modules as the channels through which the touch sensor and the proximity sensor communicate with the control terminal. The proximity sensor is an infrared ranging module: it judges whether a user is approaching the item by capturing the returned distance reading in real time, and starts timing if the user is within a predetermined distance range. As the timing proceeds, the number of lit LED lamps can increase accordingly. When the trigger timer exceeds the set value, the item identification of the item bound to the sensor is sent to the control terminal through the Wifi module. The touch sensor may have a built-in acceleration sensor and a Bluetooth module; the acceleration sensor continuously collects XYZ triaxial data and, by fusing and comparing them, obtains state information indicating whether the item has been separated from the stand, and this state information is broadcast to the main control unit through the Bluetooth module so that the trigger state of the sensor can be determined. If the main control unit determines that the item has been moved, the item identification is sent to the control terminal.
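As a non-limiting illustration of the proximity-sensor behavior just described (distance capture, timing, LED progression, and triggering), the following Python sketch may help; the distance range, time threshold, LED count, and the hardware helper functions are all placeholders assumed for illustration rather than values or interfaces taken from the disclosure.

    import time

    # Stubs for the hardware; the real infrared ranging module, LED lamp group,
    # and Wifi interface are not specified in the disclosure.
    def read_infrared_distance_cm() -> float:
        return 100.0  # stub: distance reading returned by the infrared module

    def set_lit_led_count(n: int) -> None:
        pass  # stub: light n lamps of the LED lamp group

    def send_item_id_over_wifi(item_id: str) -> None:
        pass  # stub: Wifi packet to the control terminal

    PROXIMITY_RANGE_CM = 20.0  # assumed "predetermined distance range"
    TIME_THRESHOLD_S = 3.0     # assumed "predetermined time threshold"
    LED_COUNT = 8              # assumed size of the LED lamp group

    def proximity_loop(item_id: str) -> None:
        # Start timing while an obstacle stays within range, light more LEDs as
        # the timer grows, and send the bound item identification once the set
        # value is exceeded.
        started_at = None
        while True:
            if read_infrared_distance_cm() <= PROXIMITY_RANGE_CM:
                started_at = started_at or time.monotonic()
                elapsed = time.monotonic() - started_at
                set_lit_led_count(min(LED_COUNT, int(elapsed * LED_COUNT / TIME_THRESHOLD_S)))
                if elapsed >= TIME_THRESHOLD_S:
                    send_item_id_over_wifi(item_id)
                    started_at = None  # re-arm after triggering
            else:
                started_at = None
                set_lit_led_count(0)
            time.sleep(0.05)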
In some optional implementations of this embodiment, the sensor is further configured to send position change information of the item to the control terminal in response to detecting a change in the position of the item, and the control terminal is further configured to, in response to receiving the item identification and the position change information sent by the sensor, search for the item information corresponding to the item identification in the preset item information base and output the item information to the display according to the position change information. The item information may change in real time with the position of the item. For example, when the sensor detects that the item is turned upside down, it sends the position change information to the control terminal. After finding the item information, the control terminal outputs the information related to the bottom of the item, for example a detailed text introduction of the bottom and an image or video of the bottom. As the actual position of the item changes, the naked eye 3D model is also adjusted synchronously according to the position change information and then sent to the naked eye 3D player for playing.
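Purely for illustration, a minimal sketch of selecting item information according to the reported orientation might look as follows; the "orientation" field, the face names, and the sample contents are assumptions, not part of the disclosed message format.

    # Per-face content and a simple selection rule; the "orientation" field and
    # the face names are assumptions used only for illustration.
    FACE_CONTENT = {
        "upright":     {"text": "Overview of the item", "video": "overview.mp4"},
        "upside_down": {"text": "Detailed introduction of the bottom of the item",
                        "video": "bottom.mp4"},
    }

    def select_content(position_change: dict) -> dict:
        # Pick the content matching the reported orientation; default to the overview.
        orientation = position_change.get("orientation", "upright")
        return FACE_CONTENT.get(orientation, FACE_CONTENT["upright"])

    # e.g. select_content({"orientation": "upside_down"}) returns the bottom-related content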
In some optional implementations of this embodiment, the control terminal is further configured to send an output state of the item information to the sensor. The sensor further comprises an LED lamp group configured to, in response to receiving the output state, light LED lamps according to the output state. The output state may be playing, waiting to play, etc., and may be represented by LED lamps of different colors.
Each unit can be designed in an integrated and optimized way so that its volume and power consumption are reduced. The main control unit continuously monitors the states returned by the Wifi and Bluetooth modules, judges and processes the information internally, and then pushes the item identification to the control terminal, which controls the output of the item information. The control terminal stores prefabricated 3D models of the items; by parsing the received Wifi command data packet, it obtains the playing instruction to be executed, supports special effects such as pausing playback and switching animations, and drives the item presentation on the 3D playing terminal. On the other hand, the commodity introduction information is output to the large display screen through the HDMI interface of the control terminal. In this mode, the system can automatically determine which item has been picked up by a customer and play the introduction video of that commodity according to its tilt angle.
Optionally, when the item is resting on the exhibition stand, the touch sensor does not detect that the item is moved, and the Bluetooth unit stays in a dormant state to save battery power.
Optionally, when the sensor of the corresponding item is triggered, the item identification is sent to the control terminal through the configured Bluetooth or Wifi module.
Optionally, when the control terminal receives the corresponding item identification, it compares the received identification with the item identifications stored in the control terminal in advance, finds the item with the matching identification, and then sends an output command for the item information corresponding to that identification to the display so that text or image and video content is shown. It may also send an output command for the item information corresponding to the identification to the naked eye 3D playing terminal to display the 3D model, while outputting the detailed parameter information of the item to the display screen.
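A non-limiting Python sketch of this control-terminal matching and dispatching step is given below; the stored table, the command dictionaries, and the display/3D-player helper functions are illustrative assumptions rather than the actual interfaces of the control terminal.

    from typing import Optional

    def send_to_display(command: dict) -> None:
        pass  # stub: output text / image / video to the display over HDMI

    def send_to_3d_player(command: dict) -> None:
        pass  # stub: command to the naked eye 3D playing terminal

    # Item identifications stored in the control terminal in advance (placeholder contents).
    STORED_ITEM_INFO = {
        "item-001": {"text": "Sample introduction", "video": "item-001.mp4",
                     "model": "item-001.glb"},
    }

    def handle_received_item_id(item_id: str, position_change: Optional[dict] = None) -> None:
        # Compare the received identification with the stored ones and, on a match,
        # send the output commands for the corresponding item information.
        info = STORED_ITEM_INFO.get(item_id)
        if info is None:
            return  # no item with the same identification
        send_to_display({"cmd": "show", "text": info["text"], "video": info["video"]})
        if info.get("model"):
            send_to_3d_player({"cmd": "play", "model": info["model"],
                               "pose": position_change})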
With continued reference to fig. 3, a flow 300 of one embodiment of a method for outputting information in accordance with the present application is shown. The method for outputting information comprises the following steps:
Step 301, detecting whether the time for which the item to be displayed is blocked exceeds a predetermined time threshold.
In this embodiment, the execution body of the method for outputting information (e.g., the sensor shown in fig. 1) may detect, by means of a built-in proximity sensor, whether the time for which the item to be displayed has been blocked exceeds a predetermined time threshold. Infrared is used to detect whether there is an obstacle within a predetermined range near the item; if there is, the item is considered to be blocked. If the blocked time exceeds the predetermined time threshold, sending a message to the control terminal is triggered.
Step 302, if yes, sending the item identification of the item.
In this embodiment, a sensor is mounted on each item to be displayed, so each sensor is bound to an item identification. When the sensor detects that a trigger condition is met (the item is blocked beyond the predetermined time threshold or is moved), it sends the item identification of the item to the control terminal.
Step 303, detecting whether the position information of the item has changed.
In this embodiment, whether the item to be displayed is moved may be detected by means of a built-in touch sensor. The acceleration sensor in the touch sensor may collect XYZ triaxial data to determine whether the position of the item has changed, i.e., whether it is moving. The movement may be a pure translation or may include rotation.
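For illustration only, one simple way to decide from consecutive XYZ samples whether the item has moved is sketched below; the threshold value and the use of a Euclidean difference are assumptions, since the disclosure does not fix a particular comparison rule.

    import math
    from typing import Sequence

    MOVE_THRESHOLD_G = 0.15  # assumed change (in g) that is treated as movement

    def is_moved(prev_xyz: Sequence[float], curr_xyz: Sequence[float]) -> bool:
        # Compare two consecutive XYZ samples from the acceleration sensor; a change
        # larger than the threshold is treated as the item being moved (translation
        # and rotation both perturb the measured axes).
        delta = math.sqrt(sum((c - p) ** 2 for c, p in zip(curr_xyz, prev_xyz)))
        return delta > MOVE_THRESHOLD_G

    # e.g. is_moved((0.0, 0.0, 1.0), (0.1, 0.2, 0.9)) -> True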
Step 304, if a change has occurred, sending the item identification and the position change information of the item.
In this embodiment, if the position of the item changes, it indicates that the user has picked up the item and is interested in it, so the control terminal needs to be notified of the item identification of the item in order to output the item information of the item. In addition, in order to display the item from multiple angles, the position change information may also be sent to the control terminal, so that the control terminal can select the viewing angle of the output item images and videos according to the current orientation of the item. For example, if the item is flipped over, an image, video, or text of the bottom of the item, or details of the bottom, is output.
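As an illustrative assumption of what the message in step 304 could carry (the disclosure does not specify a wire format), the following sketch bundles the item identification with the position change as a JSON payload.

    import json
    from typing import Sequence

    def build_move_message(item_id: str,
                           prev_xyz: Sequence[float],
                           curr_xyz: Sequence[float]) -> bytes:
        # Bundle the item identification with the position change for step 304.
        return json.dumps({
            "item_id": item_id,
            "position_change": {"from": list(prev_xyz), "to": list(curr_xyz)},
        }).encode("utf-8")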
In some optional implementations of this embodiment, the method further includes: in response to receiving the output state of the item information, lighting LED lamps according to the output state. When outputting the item information, the control terminal may feed back an output state, e.g., being output, queued for output, etc., to the sensor. If there is no feedback, indicating that the control terminal did not receive the command that carries the item identification and requests output of the item information, the sensor may retransmit the item identification.
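A non-limiting sketch of this feedback-and-retransmit behavior is shown below; the state-to-color mapping, the retry count, the waiting time, and the callable used for sending are all illustrative assumptions.

    import time
    from typing import Callable, Optional

    STATE_TO_COLOR = {"playing": "green", "queued": "yellow"}  # assumed mapping

    def light_led(color: str) -> None:
        pass  # stub: drive the LED lamp group

    def send_with_feedback(send_item_id: Callable[[str], Optional[str]],
                           item_id: str,
                           wait_s: float = 2.0,
                           retries: int = 3) -> Optional[str]:
        # Send the item identification and wait for the control terminal to feed
        # back an output state; retransmit when no feedback arrives.
        for _ in range(retries):
            state = send_item_id(item_id)  # assumed to return the state, or None on timeout
            if state is not None:
                light_led(STATE_TO_COLOR.get(state, "red"))
                return state
            time.sleep(wait_s)  # no feedback yet: wait, then retransmit
        return None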
With continued reference to fig. 4, a flow 400 of one embodiment of a method for outputting information in accordance with the present application is shown. The method for outputting information comprises the following steps:
Step 401, receiving an item identification and location change information of an item to be displayed.
In this embodiment, the execution body of the method for outputting information (for example, the control terminal shown in fig. 1) may receive, over a wireless connection, the item identification and the position change information of the item to be displayed sent by the sensor. Alternatively, the sensor may send only the item identification without the position change information, in which case it may be assumed that the position of the item has not changed. It should be noted that the wireless connection may include, but is not limited to, 3G/4G connections, Wifi connections, Bluetooth connections, WiMAX connections, Zigbee connections, UWB (ultra wideband) connections, and other wireless connections now known or developed in the future.
Step 402, searching for the item information corresponding to the item identification in a preset item information base.
In this embodiment, the item information base stores the correspondence between the item identifications and the item information of various items. The item information may include at least one of: text information, video information, image information, and a naked eye 3D model. The item information may be indexed by the item identification.
Step 403, outputting the item information according to the position change information.
In this embodiment, the item information may change in real time with the position of the item. For example, when the sensor detects that the item is turned upside down, it sends the position change information to the control terminal. After finding the item information, the control terminal outputs the information related to the bottom of the item, for example a detailed text introduction of the bottom and an image or video of the bottom. The naked eye 3D model is adjusted synchronously according to the position change information and then sent to the naked eye 3D player for playing.
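Purely as an illustration of keeping the naked eye 3D model synchronized with the physical item, the following sketch builds a play command from the position change information; the field names (model, rotation_deg) are assumptions and not part of any disclosed command format.

    def build_3d_play_command(model_path: str, position_change: dict) -> dict:
        # Mirror the item's reported orientation in the naked eye 3D model so the
        # rendered model follows the physical item.
        return {
            "cmd": "play",
            "model": model_path,
            "rotation_deg": position_change.get("rotation_deg", [0.0, 0.0, 0.0]),
        }

    # e.g. an upside-down item might report {"rotation_deg": [180.0, 0.0, 0.0]}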
With continued reference to fig. 5, fig. 5 is a schematic diagram of an application scenario of the method for outputting information according to the present embodiment. In the application scenario of fig. 5, the sensor performs step 501 to detect that the item to be displayed is moved or blocked, and then performs step 502 to send the item identification of the item to the control terminal. The control terminal performs step 503 to find the item information corresponding to the item identification, and then performs step 504 to output the item information to the display. The display performs step 505 to display the item information. If the item information includes a naked eye 3D model, the naked eye 3D model may be sent to the naked eye 3D player for display.
According to the method provided by the embodiments of the present application, the state change of an item detected by the sensor is transmitted to the control terminal, which is triggered to find and output the item information. In this way, the item information is output on the user's demand, and the advertising effect is improved.
With further reference to fig. 6, as an implementation of the method shown in the above figures, the present application provides an embodiment of an apparatus for outputting information, which corresponds to the method embodiment shown in fig. 3, and which is particularly applicable to various electronic devices.
As shown in fig. 6, the apparatus 600 for outputting information of the present embodiment includes: a proximity sensing unit 601, a first transmitting unit 602, a touch sensing unit 603, and a second transmitting unit 604. The proximity sensing unit 601 is configured to detect whether the time for which an item to be displayed is blocked exceeds a predetermined time threshold; the first transmitting unit 602 is configured to transmit an item identification of the item if the item is blocked beyond the predetermined time threshold; the touch sensing unit 603 is configured to detect whether the position information of the item has changed; and the second transmitting unit 604 is configured to transmit the item identification and the position change information of the item if the position information of the item has changed.
In this embodiment, specific processes of the proximity sensing unit 601, the first transmitting unit 602, the touch sensing unit 603, and the second transmitting unit 604 of the apparatus 600 for outputting information may refer to steps 301, 302, 303, 304 in the corresponding embodiment of fig. 3.
With further reference to fig. 7, as an implementation of the method shown in the above figures, the present application provides an embodiment of an apparatus for outputting information, which corresponds to the method embodiment shown in fig. 4, and which is particularly applicable to various electronic devices.
As shown in fig. 7, the apparatus 700 for outputting information of the present embodiment includes: a receiving unit 701, a searching unit 702 and an output unit 703. Wherein the receiving unit 701 is configured to receive item identification and location change information of an item to be displayed; the searching unit 702 is configured to search for item information corresponding to the item identifier from a preset item information base; the output unit 703 is configured to output item information according to the position change information.
Referring now to FIG. 8, there is illustrated a schematic diagram of a computer system 800 suitable for use in implementing an electronic device (e.g., the control terminal shown in FIG. 1) in accordance with an embodiment of the present application. The electronic device shown in fig. 8 is only an example and should not be construed as limiting the functionality and scope of use of the embodiments of the application.
As shown in fig. 8, the computer system 800 includes a Central Processing Unit (CPU) 801 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 802 or a program loaded from a storage section 808 into a Random Access Memory (RAM) 803. Various programs and data required for the operation of the system 800 are also stored in the RAM 803. The CPU 801, the ROM 802, and the RAM 803 are connected to each other by a bus 804. An input/output (I/O) interface 805 is also connected to the bus 804.
The following components are connected to the I/O interface 805: an input portion 806 including a keyboard, mouse, etc.; an output portion 807 including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and a speaker; a storage section 808 including a hard disk or the like; and a communication section 809 including a network interface card such as a LAN card, a modem, or the like. The communication section 809 performs communication processing via a network such as the internet. The drive 810 is also connected to the I/O interface 805 as needed. A removable medium 811 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 810 as needed so that a computer program read out therefrom is mounted into the storage section 808 as needed.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network via the communication section 809, and/or installed from the removable media 811. The above-described functions defined in the method of the present application are performed when the computer program is executed by a Central Processing Unit (CPU) 801. The computer readable medium according to the present application may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present application, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including object oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present application may be implemented in software or in hardware. The described units may also be provided in a processor, for example, described as: a processor includes a receiving unit, a lookup unit, and an output unit. The names of these units do not constitute a limitation on the unit itself in some cases, and for example, the receiving unit may also be described as "a unit that receives item identification and position change information of an item to be displayed".
As another aspect, the present application also provides a computer-readable medium that may be contained in the apparatus described in the above embodiments; or may be present alone without being fitted into the device. The computer readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to receive item identification and location change information for an item to be displayed; searching article information corresponding to the article identifier from a preset article information base; and outputting the article information according to the position change information.
The above description is only illustrative of the preferred embodiments of the present application and of the principles of the technology employed. It will be appreciated by persons skilled in the art that the scope of the application referred to herein is not limited to technical solutions formed by the specific combinations of the technical features described above, but also covers other technical solutions formed by any combination of the above technical features or their equivalents without departing from the inventive concept, for example, technical solutions formed by replacing the above features with technical features having similar functions disclosed in (but not limited to) the present application.