WO2011088670A1 - Interactive information system, interactive information method, and computer readable medium thereof
- Publication number
- WO2011088670A1 (PCT/CN2010/075220)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B5/00—Electrically-operated educational appliances
- G09B5/06—Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
- G06Q50/20—Education
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B19/00—Teaching not covered by other main groups of this subclass
- G09B19/06—Foreign languages
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B7/00—Electrically-operated teaching apparatus or devices working with questions and answers
Definitions
- the present invention relates to an interactive information system, an interactive information method, and a computer readable medium thereof. More particularly, the interactive information system, the interactive information method, and the computer readable medium thereof of the present invention read an identification code on an information gadget and output a file corresponding to that information gadget.
- An objective of this invention is to provide an interactive information system.
- the interactive information system comprises a first transmission interface, a storage unit, a processing unit, and a second transmission interface.
- the processing unit is electrically connected to the first transmission interface, the storage unit, and the second transmission interface.
- the storage unit stores a database.
- the first transmission interface is configured to receive a signal.
- the processing unit is configured to receive the signal from the first transmission interface, to identify an information gadget indicated by the signal, and to retrieve a file according to the information gadget from the database.
- the second transmission interface is electrically connected to the processing unit and configured to transmit the file.
- the interactive information system comprises a first transmission interface, a storage unit, a processing unit, and a second transmission interface.
- the storage unit stores a database.
- the processing unit is electrically connected to the first transmission interface and the storage unit.
- the second transmission interface is electrically connected to the processing unit.
- the interactive information method comprises the following steps of: (a) enabling the first transmission interface to receive a signal; (b) enabling the processing unit to receive the signal from the first transmission interface; (c) enabling the processing unit to identify an information gadget indicated by the signal; (d) enabling the processing unit to retrieve a file according to the information gadget from the database; and (e) enabling the second transmission interface to transmit the file.
- the computer readable medium stores a program of an interactive information method for use in an interactive information system
- the interactive information system comprises a first transmission interface, a storage unit, a processing unit, a second transmission interface, and an output apparatus.
- the storage unit stores a database.
- the processing unit is electrically connected to the first transmission interface and the storage unit.
- the second transmission interface is electrically connected to the processing unit.
- the output apparatus is electrically connected to the second transmission interface.
- the program comprises a code A, a code B, a code C, a code D, a code E, a code F, and a code G; a minimal illustrative sketch of these codes follows the list below.
- the code A enables the first transmission interface to receive a signal.
- the code B enables the processing unit to receive the signal from the first transmission interface.
- the code C enables the processing unit to identify a first information gadget indicated by the signal.
- the code D enables the processing unit to identify a second information gadget indicated by the signal.
- the code E enables the processing unit to determine that a number of the information gadgets is greater than one.
- the code F enables the processing unit to retrieve a file according to the first information gadget from the database.
- the code G enables the second transmission interface to transmit the file to the output apparatus.
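- The program codes A through G above describe a straightforward control flow. The following is a minimal, illustrative sketch of that flow in Python; every function name, the comma-separated signal format, and the example ID codes are assumptions for illustration and are not taken from the patent.

```python
# Minimal sketch of program codes A-G; all names and the signal format are
# hypothetical illustrations, not the patented implementation.
def code_a_receive_signal(raw_signal):        # code A: first transmission interface receives a signal
    return raw_signal

def code_b_forward_to_processor(signal):      # code B: processing unit receives the signal
    return signal

def code_c_identify_first_gadget(signal):     # code C: identify the first information gadget
    return signal.split(",")[0]

def code_d_identify_second_gadget(signal):    # code D: identify the second information gadget, if any
    parts = signal.split(",")
    return parts[1] if len(parts) > 1 else None

def code_e_more_than_one(signal):             # code E: determine that the number of gadgets is greater than one
    return code_d_identify_second_gadget(signal) is not None

def code_f_retrieve_file(id_code, database):  # code F: retrieve a file according to the first gadget
    return database.get(id_code)

def code_g_transmit(file_entry):              # code G: second transmission interface transmits the file
    print("output apparatus receives:", file_entry)

database = {"0x330": "file corresponding to the first information gadget"}
signal = code_b_forward_to_processor(code_a_receive_signal("0x330,0x332"))
if code_e_more_than_one(signal):
    code_g_transmit(code_f_retrieve_file(code_c_identify_first_gadget(signal), database))
```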
- the interactive information system, the interactive information method, and the computer readable medium thereof of the present invention are able to retrieve a file corresponding to the information gadget(s) from the database and to output the file.
- the present invention could overcome the drawbacks of the prior art (i.e. the conventional information exchange systems lack interactivity and interest) and increase interactivity and interest in information exchange applications.
- FIG. 1A illustrates a schematic view of a first embodiment of the present invention
- FIGs. 1B and 1C illustrate schematic views of a cubic information gadget;
- FIGs. 1D, 1E, and 1F individually illustrate schematic views of an information gadget;
- FIG. 1G illustrates a schematic view of a second embodiment of the present invention.
- FIG. 1H illustrates an example of the second embodiment of the present invention
- FIG. 2 illustrates a schematic view of a third embodiment of the present invention
- FIGs. 3A-3E are the flowcharts of a fourth embodiment of the present invention.
- the present invention provides an interactive information system, an interactive information method and a computer readable medium thereof to assist users to learn and/or understand different fields of knowledge.
- the interactive information system may be used with a plurality of information gadgets, and each of the information gadgets is recorded with information of a specific field of knowledge.
- a user may choose one or some of the information gadgets and show the chosen information gadgets to the interactive information system.
- the interactive information system then functions according to the chosen information gadgets. From the functions performed by the interactive information system, the user can learn the specific field of knowledge from the chosen information gadgets. Consequently, the interactive information system can be used as a learning system.
- the interactive information system can be used for Chinese learning when the information gadgets are recorded with information of Chinese, for math learning when the information gadgets are recorded with information of math, for games playing when the information gadgets are recorded with information of games, and so on.
- a first embodiment of the present invention is an interactive information system 1, whose schematic view is illustrated in FIG. 1A.
- the interactive information system 1 comprises a reading apparatus 11, a first transmission interface 12, a processing unit 13, a storage unit 14, a second transmission interface 15, and an output apparatus 16.
- the processing unit 13 is electrically connected to the reading apparatus 11 via the first transmission interface 12 and is electrically connected to the output apparatus 16 via the second transmission interface 15.
- the processing unit 13 is electrically connected to the first transmission interface 12, the second transmission interface 15, and the storage unit 14.
- the storage unit 14 stores a database.
- FIG. 1A also illustrates a plurality of information gadgets 111, 112, 113, ..., 114 which are used with the interactive information system 1.
- the interactive information system 1 is a computer
- the reading apparatus 11 is a webcam
- the output apparatus 16 comprises a display unit and a speaker; however, these elements may be replaced by other devices in other embodiments.
- the reading apparatus 11 may be replaced by a barcode reader or other apparatuses capable of reading the identifier of the information gadgets 111, 112, 113, 114
- the interactive information system 1 may be replaced by a mobile phone, a personal digital assistant, or another computing apparatus with computer processing ability
- the output apparatus 16 may be replaced by a monitor, an LCD display, headphones, or other apparatus capable of outputting information.
- the reading apparatus 11 and the output apparatus 16 are built-in apparatuses of the interactive information system 1; however, in other embodiments, the reading apparatus 11 or the output apparatus 16 may be an external apparatus connected to a computer. It should be appreciated herein that the present invention is not limited to the types of the reading apparatus 11 or the output apparatus 16; the reading apparatus 11 or the output apparatus 16 may also be replaced in other manners by those of ordinary skill in the art depending on practical needs, and this will not be further described herein.
- an information gadget can be in any shape; that is, an information gadget may be a cube, a card, etc.
- An information gadget has an identifier recognizable by an interactive information system; the identifier may be realized as a 2-dimensional (2D) barcode, a radio-frequency identification (RFID) tag, etc.
- the information gadget is recorded with content related to a specific field of knowledge, such as Chinese, mathematics, physics, games, etc. Exemplary information gadgets of the present invention are given below for better understanding.
- FIGs. 1B and 1C illustrate schematic views of the information gadget 111, which is a cube.
- the identifier of the information gadget 111 is the 2D barcode 111a.
- the information gadget 111 can be used for Chinese learning because its content is related to Chinese language.
- the content recorded on the information gadget 111 comprises a simplified Chinese character "爱" 111b, a traditional Chinese character "愛" 111c, an English word "Love" 111d, a picture representing love 111e, and a phonetic alphabet 111f.
- FIGs. 1D, 1E, and 1F illustrate another three information gadgets 112, 113, 114, which are cards.
- the information gadget 112 is recorded with a 2D barcode 112a representing its identifier, a traditional Chinese character "吃" 112b, and an English word "Eat" 112c;
- the information gadget 113 is recorded with a 2D barcode 113a representing its identifier, a traditional Chinese character "飯" 113b, and an English word "Meal" 113c;
- the information gadget 114 is recorded with a 2D barcode 114a representing its identifier, a traditional Chinese phrase (meaning "pronounce") 114b, and an English word "pronounce" 114c.
- an information gadget may be further classified as a functional information gadget or a data information gadget depending on the contents recorded thereon.
- if the content recorded on an information gadget corresponds to a piece of information, such as a Chinese character, the information gadget is classified as a data information gadget; if the content recorded on an information gadget corresponds to an instruction, the information gadget is classified as a functional information gadget.
- the information gadgets 111, 112, 113 are data information gadgets, while the information gadget 114 is a functional information gadget.
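- A minimal sketch of how the data/functional distinction might be modelled in code is given below; the class, field names, and example ID codes are hypothetical and only illustrate the classification described above.

```python
# Hypothetical model of information gadgets; field names and ID codes are
# illustrative only, not taken from the patent.
from dataclasses import dataclass

@dataclass
class InformationGadget:
    id_code: str    # value decoded from the gadget's 2D barcode or RFID tag
    kind: str       # "data" for a piece of information, "functional" for an instruction
    content: str    # e.g. a Chinese character or an instruction such as "pronounce"

gadgets = [
    InformationGadget("0x330", "data", "Eat"),
    InformationGadget("0x332", "data", "Meal"),
    InformationGadget("0x340", "functional", "pronounce"),
]

data_gadgets = [g for g in gadgets if g.kind == "data"]
functional_gadgets = [g for g in gadgets if g.kind == "functional"]
print(len(data_gadgets), "data gadgets,", len(functional_gadgets), "functional gadget")
```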
- the database stored in the storage unit 14 stores at least one corresponding file.
- Each interaction of the interactive information system 1 begins from placing one or more of the information gadgets 111, 112, 113, 114 in a reading range of the reading apparatus 11.
- the interactive information system 1 functions differently depending on whether the number of the placed information gadgets is one or more than one.
- the reading apparatus 11 reads an identifier of the information gadget and generates a signal related to the identifier. Meanwhile, the reading apparatus 11 continuously captures an image in real time, wherein the image comprises the information gadget and a scene in front of the reading apparatus 11. Then, the reading apparatus 11 transmits the signal and the captured images to the processing unit 13 via the first transmission interface 12. After receiving the signal and the captured images from the first transmission interface 12, the processing unit 13 identifies the information gadget indicated by the signal. In particular, the processing unit 13 identifies the information gadget by identifying the identifier on the information gadget and generates an identification (ID) code representing the information gadget.
- the processing unit 13 checks whether the database stored in the storage unit 14 stores a file corresponding to the ID code. If the database stores a file corresponding to the ID code, the processing unit 13 will retrieve the file according to the ID code, and transmit the file and the captured images to the output apparatus 16 via the second transmission interface 15. The output apparatus 16 subsequently displays the captured images along with the file in a user interface.
- the file may be an image file, a video file, an audio file or a combination thereof which records a piece of information relating to a knowledge field.
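- A minimal sketch of the single-gadget lookup described above is given below, assuming an in-memory dictionary stands in for the database stored in the storage unit 14; the helper names and the example ID code are hypothetical.

```python
# Single-gadget flow: signal -> ID code -> database lookup -> file (or None).
# The dictionary, key, and payload below are illustrative placeholders.
DATABASE = {
    "0x310": {"type": "video+audio", "payload": "stroke order and pronunciation file"},
}

def identify(signal: str) -> str:
    # In the patent the signal encodes the gadget's identifier; here we simply
    # assume the decoded ID code is carried in the signal string itself.
    return signal.strip()

def handle_single_gadget(signal: str):
    id_code = identify(signal)          # processing unit identifies the ID code
    return DATABASE.get(id_code)        # None means the database stores no corresponding file

file_entry = handle_single_gadget("0x310")
if file_entry is not None:
    print("transmit to output apparatus:", file_entry["payload"])
```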
- the reading apparatus 11 reads the 2D barcode 111a of the information gadget 111 and transmits a signal 111g representing the 2D barcode 111a to the first transmission interface 12.
- the reading apparatus 11 continuously captures an image 121 in real time and transmits the captured images 121 to the first transmission interface 12, wherein the image 121 comprises the information gadget 111 and a scene in front of the reading apparatus 11.
- the processing unit 13 receives the signal 111g and the captured images 121 via the first transmission interface 12. Meanwhile, the processing unit 13 determines that there is only one information gadget within the reading range of the reading apparatus 11.
- the processing unit 13 identifies an ID code 111h indicated by the signal 111g, retrieves a file 111i (for example, a video file showing the stroke order of the traditional Chinese character "愛" and an audio file showing the pronunciation of the traditional Chinese character "愛") from the database stored in the storage unit 14 according to the ID code 111h, and transmits the file 111i and the captured images 121 to the output apparatus 16 via the second transmission interface 15.
- After receiving the file 111i and the captured images 121 from the processing unit 13, the output apparatus 16 outputs the captured images 121 along with the file 111i in a user interface (i.e. the display unit of the output apparatus 16 displays the image of the English word "Love", and the speaker of the output apparatus 16 broadcasts the Chinese pronunciation "ai" of the Chinese character "愛"). From the file 111i shown on the output apparatus 16, the user learns the stroke order and the pronunciation of the traditional Chinese character "愛".
- the processing unit 13 has to determine the number of groups formed by the placed information gadgets based on the distance between these placed information gadgets.
- the reading apparatus 11 reads the 2D barcodes of the placed information gadgets and transmits signals individually representing the 2D barcodes to the first transmission interface 12. Meanwhile, the reading apparatus 11 continuously captures an image in real time and transmits the captured images to the first transmission interface 12, wherein the image comprises the information gadgets and a scene in front of the reading apparatus 11.
- the processing unit 13 receives the signals and the captured images via the first transmission interface 12 and identifies the ID codes indicated by the signals. The processing unit 13 also determines that there are two information gadgets and the distance between the information gadgets is not shorter than the predetermined distance.
- the processing unit 13 treats them separately (i.e. treats them as two groups). Consequently, for each of the ID codes, the processing unit 13 retrieves a file from the database stored in the storage unit 14 and then transmits the files and the captured images to the output apparatus 16 via the second transmission interface 15. After receiving the files and the captured images from the second transmission interface 15, the output apparatus 16 outputs the files and the captured images.
- the processing unit 13 has to determine whether the group is a meaningful group or a meaningless group.
- for example, if the information gadget 112 is placed at the left of the information gadget 113 (i.e. the information gadget 113 is placed at the right of the information gadget 112), the group formed by them is meaningful because the Chinese phrase formed by them has the meaning of "having a meal."
- if the information gadget 113 is instead placed at the left of the information gadget 112, the group formed by them is meaningless. If the information gadget 112 is placed at the left of the information gadget 114, it is meaningful because there is one functional information gadget.
- if the processing unit 13 determines that the distance between the information gadgets is shorter than the predetermined distance, the processing unit 13 will subsequently determine their order according to their relative positions.
- the processing unit 13 receives the signals representing the 2D barcodes of the information gadgets from the reading apparatus 11 and identifies the ID codes indicated by the signals. Afterwards, the processing unit 13 groups the two ID codes into a grouped ID code representing the combination of the information gadgets according to the order of the information gadgets, and determines whether the grouped ID code is a permissible grouped ID code. If the grouped ID code is a permissible grouped ID code, the group formed by the information gadgets is meaningful; otherwise, it is meaningless.
- the processing unit 13 determines whether the grouped ID code is a permissible grouped ID code as follows: after the processing unit 13 groups two ID codes into a grouped ID code, the processing unit 13 will further check whether the database stored in the storage unit 14 stores a file corresponding to the grouped ID code. If yes, the processing unit 13 determines that the grouped ID code is a permissible grouped ID code. If the database does not store a file corresponding to the grouped ID code, the processing unit 13 determines that the grouped ID code is not a permissible grouped ID code.
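- The distance, order, and permissibility decisions described above can be summarized in a short sketch; the threshold value, positions, and lookup table below are hypothetical, and the code only illustrates the decision sequence.

```python
# Sketch of the multi-gadget decision: separate handling vs. grouping,
# ordering by position, and the permissibility check against the database.
PREDETERMINED_DISTANCE = 50          # arbitrary illustrative threshold
GROUP_DATABASE = {"0x330-0x332": "explanation of the phrase 'having a meal'"}

def resolve(readings):
    """readings: list of (id_code, x_position) pairs read in one frame."""
    if len(readings) == 1:
        return f"look up single ID {readings[0][0]}"
    (id_a, x_a), (id_b, x_b) = readings[:2]
    if abs(x_a - x_b) >= PREDETERMINED_DISTANCE:
        return f"treat separately: {id_a}, {id_b}"
    # Order by relative position (left gadget first), then form the grouped ID code.
    ordered = [code for code, _ in sorted(readings[:2], key=lambda r: r[1])]
    grouped_id = "-".join(ordered)
    if grouped_id in GROUP_DATABASE:
        return f"permissible group {grouped_id}: {GROUP_DATABASE[grouped_id]}"
    return f"impermissible group {grouped_id}: output the erroneous message"

print(resolve([("0x332", 10.0), ("0x330", 20.0)]))   # meaningless order
print(resolve([("0x330", 10.0), ("0x332", 20.0)]))   # meaningful order
```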
- When the grouped ID code is impermissible, the processing unit 13 will retrieve an erroneous message 100 from the database stored in the storage unit 14 and then transmit the erroneous message 100 to the output apparatus 16 via the second transmission interface 15. After receiving the erroneous message 100 from the second transmission interface 15, the output apparatus 16 outputs the erroneous message 100. From the erroneous message 100 shown on the output apparatus 16, the user can learn that the group formed by the information gadgets is meaningless.
- If the processing unit 13 determines that the grouped ID code is a permissible grouped ID code, the processing unit 13 will subsequently retrieve a file corresponding to the permissible grouped ID code from the database stored in the storage unit 14 and then transmit the file to the output apparatus 16 via the second transmission interface 15. After receiving the file corresponding to the permissible grouped ID code from the second transmission interface 15, the output apparatus 16 outputs the file so that the user can learn the meaning of the grouped information gadgets.
- a first example is that the information gadgets 112, 113 enter the reading range of the reading apparatus 11 together.
- the reading apparatus 11 reads the 2D barcodes 112a, 113a and transmits signals 112g, 113g respectively corresponding to the 2D barcodes 112a, 113a to the first transmission interface 12. Meanwhile, the reading apparatus 11 continuously captures an image 122 in real time and transmits the captured images 122 to the first transmission interface 12, wherein the image 122 comprises the information gadgets 112, 113 and a scene in front of the reading apparatus 11.
- the processing unit 13 receives the signals 112g, 113g and the captured images 122 via the first transmission interface 12.
- the processing unit 13 determines that there are two information gadgets and then determines whether the distance between the information gadgets 112, 113 is shorter than a predetermined distance. If the distance between the information gadgets 112, 113 is not shorter than the predetermined distance, the processing unit 13 will process them separately. Specifically, the processing unit 13 identifies an ID code 112h, for example 0x330, indicated by the signal 112g and an ID code 113h, for example 0x332, indicated by the signal 113g.
- the processing unit 13 retrieves a first file 112i (for example, an animation showing the stroke order of the Chinese character "吃") and a second file 113i (for example, an animation showing the stroke order of the Chinese character "飯") respectively according to the ID code 112h and the ID code 113h from the database stored in the storage unit 14.
- the processing unit 13 transmits the files 112i, 113i and the captured images 122 to the output apparatus 16 via the second transmission interface 15.
- the output apparatus 16 individually outputs the captured images 122 along with the files 112i, 113i in a user interface, i.e. the display unit individually displays the animations showing the stroke order of the Chinese character "吃" and the stroke order of the Chinese character "飯".
- if the processing unit 13 determines that the distance between the information gadgets 112, 113 is shorter than the predetermined distance, the processing unit 13 will subsequently determine the order of the information gadgets 112, 113 according to their relative positions.
- the first situation to be discussed is that the information gadget 113 is placed at the left of the information gadget 112.
- the processing unit 13 determines that the order of the information gadgets 113, 112 is that the information gadget 113 is prior to the information gadget 112, which means the ID code 113h is prior to the ID code 112h.
- the processing unit 13 then forms a grouped ID code 120h (for example, 0x332-0x330) according to the order of the ID codes (i.e. the ID code 113h prior to the ID code 112h).
- the processing unit 13 determines whether the grouped ID code 120h is a permissible grouped ID code.
- the grouped ID code 120h is not a permissible grouped ID code (i.e. the database stored in the storage unit 14 does not store a file corresponding to the grouped ID code). Therefore, the processing unit 13 retrieves an erroneous message 100 according to the impermissible grouped ID code 120h from the database stored in the storage unit 14 and then transmits the erroneous message 100 and the captured images 122 to the output apparatus 16 via the second transmission interface 15.
- After receiving the erroneous message 100 and the captured images 122 from the second transmission interface 15, the output apparatus 16 outputs the captured images 122 along with the erroneous message 100 in a user interface. According to the erroneous message 100 shown on the output apparatus 16, a user can learn that placing the information gadget 113 at the left of the information gadget 112 is invalid, i.e. the Chinese phrase is meaningless.
- in the second situation, where the information gadget 112 is placed at the left of the information gadget 113, the processing unit 13 will determine that the order of the information gadgets 112, 113 is that the information gadget 112 is prior to the information gadget 113, which means the ID code 112h is prior to the ID code 113h.
- the processing unit 13 then forms a grouped ID code 121h (for example, 0x330-0x332) and determines that the grouped ID code 121h is a permissible grouped ID code (i.e. the database stored in the storage unit 14 stores a file corresponding to the grouped ID code).
- the processing unit 13 retrieves a file 121i (for example, an explanation of the Chinese phrase "吃飯") from the database stored in the storage unit 14 according to the permissible grouped ID code 121h and then transmits the file 121i and the captured images 122 to the output apparatus 16 via the second transmission interface 15.
- After receiving the file 121i and the captured images 122 from the second transmission interface 15, the output apparatus 16 outputs the captured images 122 along with the file 121i (i.e. the explanation of the Chinese phrase "吃飯") in a user interface.
- in a second example, the reading apparatus 11 reads the 2D barcodes 112a, 114a and transmits the signals 112g, 114g respectively representing the 2D barcodes 112a, 114a to the first transmission interface 12.
- the processing unit 13 receives the signals 112g, 114g via the first transmission interface 12 and identifies the ID codes 112h, 114h respectively indicated by the signals 112g, 114g. Meanwhile, the processing unit 13 determines that the distance between the information gadgets 112, 114 is shorter than a predetermined distance, and a grouped ID code is formed.
- if the processing unit 13 further determines that the grouped ID code is a permissible grouped ID code, the processing unit 13 retrieves a file 114i corresponding to the grouped ID code from the database stored in the storage unit 14 according to the grouped ID code. Since the information gadget 114 represents pronunciation, the file 114i is an audio file of the traditional Chinese character "吃". The processing unit 13 then transmits the file 114i to the output apparatus 16. Since the output apparatus 16 comprises a speaker, it is able to play the file 114i (i.e. the audio).
- the interactive information system 1 may further download a learning material from a Web server.
- the processing unit 13 of the interactive information system 1 is further connected to a Web server directly or through a network (not shown in figures).
- the processing unit 13 receives a learning material from the Web server and transmits the learning material to the output apparatus 16 via the second transmission interface 15. In this manner, a user may get more information, supplied by the Web server, from the interactive information system 1 and will not be limited to the information gadgets.
- the processing unit 13 of the interactive information system 1 may further record a learning result related to the learning material received from the Web server and transmit the learning result to the Web server for tracking a learning progress of a user.
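- A minimal sketch of the Web-server exchange is given below, assuming a plain HTTP interface; the server address, endpoint paths, and JSON fields are purely hypothetical, since the patent does not specify a protocol.

```python
# Hypothetical HTTP exchange with the Web server: download a learning material
# and report a learning result for progress tracking. All URLs are placeholders.
import json
import urllib.request

SERVER = "http://example.com"   # placeholder address of the Web server

def fetch_learning_material(id_code: str) -> bytes:
    with urllib.request.urlopen(f"{SERVER}/material/{id_code}") as resp:
        return resp.read()       # forwarded to the output apparatus via the second transmission interface

def report_learning_result(user: str, result: dict) -> None:
    body = json.dumps({"user": user, "result": result}).encode()
    req = urllib.request.Request(f"{SERVER}/progress", data=body,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)  # the server can track the user's learning progress
```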
- the above descriptions are used to illustrate how the information gadgets interact with the interactive information system 1; however, before using the interactive information system 1, an authentication procedure may be needed. After passing the authentication procedure, a user may be authorized to begin the interactions provided by the interactive information system 1.
- the processing unit 13 of the interactive information system 1 is further connected to an authentication server directly or through a network (not shown in figures).
- the authentication procedure begins with an activation gadget which is recorded with an ID code related to a piece of authentication information. How the interactive information system 1 executes the authentication procedure will be described hereafter.
- the reading apparatus 11 reads the ID code of the activation gadget and generates an activation signal related to the ID code of the activation gadget. Then, the reading apparatus 11 transmits the activation signal to the processing unit 13 via the first transmission interface 12. After receiving the activation signal from the first transmission interface 12, the processing unit 13 identifies the ID code indicated by the activation signal and executes an activation procedure with the authentication server. After passing the activation procedure, the authentication server will transmit a correct message to the processing unit 13, and the processing unit 13 will transmit the correct message to the output apparatus 16 via the second transmission interface 15. From the correct message shown on the output apparatus 16, a user knows that the interactive information system 1 is ready for use.
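- A minimal sketch of the activation check is shown below; the set of valid codes stands in for the authentication server, and the messages and example code are illustrative placeholders.

```python
# Sketch of the activation procedure: the ID code from the activation gadget is
# checked, and a correct message is produced when activation succeeds.
def authenticate(activation_signal: str, valid_codes: set) -> str:
    id_code = activation_signal.strip()     # ID code indicated by the activation signal
    if id_code in valid_codes:              # stands in for the exchange with the authentication server
        return "correct message: the interactive information system is ready for use"
    return "activation failed"

print(authenticate("AUTH-0001", {"AUTH-0001"}))   # hypothetical activation code
```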
- a second embodiment of the present invention is an interactive information system la, whose schematic view is illustrated in FIG. 1G.
- the interactive information system la is the same as the interactive information system 1 except that the reading apparatus 11 is replaced by an input apparatus 11a. Therefore, the composition of the interactive information system la and the connection between each element of the interactive information system la will not be further described herein.
- the input apparatus 11a can be controlled by a user. When the user uses the input apparatus 11a to perform a selection, the input apparatus 11a will generate a signal related to the selection.
- the input apparatus 11a is a mouse; in other embodiments, it may be replaced by a touch panel or another apparatus that can receive an instruction from a user.
- since the output apparatus 16 is a display unit, the information gadgets are displayed on the display unit beginning from a time before the user uses the input apparatus 11a to make a selection.
- FIG. 1H illustrates an example of a frame 18 displayed by the display unit in the second embodiment.
- the frame 18 comprises a first sub-frame 181 for displaying the selected information gadgets and a second sub-frame 183 for displaying a plurality of information gadgets for selection.
- the display unit displays a plurality of information gadgets (i.e. the information gadget 183a, 183b, 183c, and 183d) as a menu bar for selection.
- the input apparatus 11a When a user uses the input apparatus 11a to select at least one information gadget, the input apparatus 11a generates a signal related to the selection.
- assume that the user selects a first information gadget by using a mouse (i.e. the input apparatus 11a).
- the input apparatus 11a will generate a signal related to the selection.
- the processing unit 13a then receives the signal via the first transmission interface 12a.
- the information gadget 181a is selected and displayed within the first sub-frame 181.
- After receiving the signal from the first transmission interface 12a, the processing unit 13a identifies an ID code indicated by the signal, wherein the ID code represents the first information gadget. The processing unit 13a then retrieves a first file corresponding to the ID code from the storage unit 14a.
- the first file is the image file 181b displayed within the first sub-frame 181, which shows the meaning of the information gadget 181a.
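- A minimal sketch of this selection-driven variant is given below, assuming the on-screen menu maps a clickable gadget label to an ID code; the labels, keys, and file descriptions are hypothetical.

```python
# Second-embodiment flow: a mouse selection (instead of a barcode read) yields
# the signal, which is mapped to an ID code and then to a file for display.
MENU = {"gadget 183a": "0x330", "gadget 183b": "0x332"}              # on-screen menu bar
FILES = {"0x330": "image file explaining the selected gadget"}       # database in the storage unit

def on_selection(label: str):
    """Called when the input apparatus 11a reports a selection."""
    id_code = MENU.get(label)        # signal related to the selection
    if id_code is None:
        return None
    return FILES.get(id_code)        # first file shown within the first sub-frame 181

print(on_selection("gadget 183a"))
```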
- a third embodiment of the present invention is an interactive information network 2, whose schematic view is illustrated in FIG. 2.
- the interactive information network 2 comprises a communication network 20, a server 21, and a plurality of user terminals 22, 23, 24, 25, which are connected via the communication network 20 in a wired or wireless manner.
- the communication network 20 is the Internet; however, it may be replaced by other networks in other embodiments.
- Each of the user terminals 22, 23, 24 comprises a reading apparatus and an output apparatus, such as the reading apparatus 11 and the output apparatus 16 in the first embodiment.
- the server 21 comprises a first transmission interface, a second transmission interface, a processing unit, and a storage unit as the interactive information system 1 of the first embodiment.
- When an information gadget enters the reading range of the reading apparatus of the user terminal 22, the reading apparatus of the user terminal 22 reads the 2D barcode on the information gadget and transmits a signal representing the 2D barcode to the server 21.
- the processing unit of the server 21 receives the signal via the communication network 20. After receiving the signal from the user terminal 22, the processing unit of the server 21 identifies an ID code indicated by the signal. Then, the processing unit of the server 21 retrieves a file according to the ID code from the storage unit of the server 21 and transmits the file to the user terminal 22 via the communication network 20. After receiving the file from the server 21, the output apparatus of the user terminal 22 outputs the file.
- interaction may also be constructed between the user terminal 22 and the user terminal 23. More details are given below.
- the reading apparatus of the user terminal 22 reads the 2D barcode of an information gadget and transmits a signal representing the 2D barcode to the server 21
- the processing unit of the server 21 receives the signal and identifies the ID code indicated within the signal.
- the processing unit of the server 21 retrieves a file corresponding to the ID code from the database stored in the storage unit of the server 21.
- the processing unit of the server 21 transmits the file to the user terminal 22 and the user terminal 23 via the communication network 20.
- the output apparatus of the user terminal 22 and the output apparatus of the user terminal 23 output the file.
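- A minimal server-side sketch of this networked variant is given below; the in-memory database, terminal objects, and broadcast logic are illustrative assumptions rather than the patented implementation.

```python
# Sketch of the third embodiment: the server resolves an ID code and sends the
# resulting file to one or more user terminals over the communication network.
DATABASE = {"0x330": "file for the gadget read at user terminal 22"}

class Terminal:
    def __init__(self, name: str):
        self.name = name
    def output(self, file_entry: str) -> None:
        print(f"{self.name} outputs: {file_entry}")   # stands in for the terminal's output apparatus

def handle_signal(signal: str, terminals) -> None:
    id_code = signal.strip()
    file_entry = DATABASE.get(id_code)
    if file_entry is None:
        return
    for terminal in terminals:               # e.g. user terminals 22 and 23
        terminal.output(file_entry)

handle_signal("0x330", [Terminal("terminal 22"), Terminal("terminal 23")])
```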
- the server 21 can execute all the operations and functions of the interactive information system 1 set forth in the first embodiment. How the server 21 of the third embodiment executes these operations and functions will be readily appreciated by those of ordinary skill in the art based on the explanation of the first embodiment, and thus will not be further described herein.
- FIGs. 3A-3E depict a fourth embodiment of this invention, which is an interactive information method adapted for use in an interactive information system, e.g., the interactive information system 1 described in the first embodiment.
- the interactive information method of the fourth embodiment may be implemented by a computer program product.
- This computer program product may be stored in a tangible machine-readable medium, such as a ROM, a flash memory, a floppy disk, a hard disk, a compact disk, a mobile disk, a magnetic tape, a database accessible to networks, or any other storage media with the same function and well known to those skilled in the art.
- the interactive information method of the fourth embodiment comprises the following steps.
- step 301 is executed to enable the reading apparatus to read an information gadget (or information gadgets) and generate a signal related to the information gadget(s) when the information gadget(s) enters the reading range of the reading apparatus, and step 302 is executed to enable the reading apparatus to continuously capture an image of a scene in front of the reading apparatus in real time and transmit the signal and the captured images to the processing unit.
- step 303 is executed to enable the processing unit to receive the signal and the captured images from the reading apparatus via the first transmission interface, and step 304 is executed to enable the processing unit to identify the ID code(s) indicated by the signal.
- step 305 is executed to enable the processing unit to determine whether the number of the information gadgets is greater than one.
- If the number of the information gadgets is not greater than one, step 306 is executed to enable the processing unit to retrieve a file from the database stored in the storage unit according to the ID code.
- step 307 is executed to enable the processing unit to transmit the file and the captured images to the output apparatus via the second transmission interface.
- step 308 is executed to enable the display unit to continuously display the captured images of the scene along with the file in a user interface.
- If the number of the information gadgets is greater than one, step 309 is executed to enable the processing unit to further determine whether the distance between the information gadgets is shorter than a predetermined distance. If no, please refer to FIG. 3D for the following steps to be executed.
- Step 310 is executed to enable the processing unit to retrieve files from the database stored in the storage unit according to each of the ID codes.
- step 311 is executed to enable the processing unit to transmit the files and the captured images to the output apparatus via the second transmission interface.
- step 312 is executed to enable the display unit to continuously display the captured images of the scene along with the files in a user interface.
- If the distance is shorter than the predetermined distance, step 313 is executed next to enable the processing unit to determine an order of the information gadgets according to relative positions of the information gadgets. Please refer to FIG. 3B for the steps to be executed. Then, step 314 is executed to enable the processing unit to generate a grouped ID code representing the combination of the information gadgets according to the order. Next, step 315 is executed to enable the processing unit to determine whether the grouped ID code is a permissible grouped ID code. If no, please refer to FIG. 3E for the following steps to be executed. Step 316 is executed to enable the processing unit to retrieve an erroneous message from the database stored in the storage unit.
- step 317 is executed to enable the processing unit to transmit the erroneous message and the captured images to the output apparatus via the second transmission interface.
- step 318 is executed to enable the display unit to continuously display the captured images of the scene along with the erroneous message in a user interface.
- In step 315, if the grouped ID code is determined as a permissible grouped ID code, step 319 is executed to enable the processing unit to retrieve a file according to the grouped ID code from the database stored in the storage unit.
- step 320 is executed to enable the processing unit to transmit the file and the captured images to the output apparatus via the second transmission interface.
- step 321 is executed to enable the display unit to continuously display the captured images of the scene along with the file in a user interface.
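- The steps above can be tied together in one consolidated sketch; the database contents, distance threshold, and positions are hypothetical, and the step numbers appear only as orientation comments.

```python
# Consolidated sketch of steps 301-321 for one frame of readings.
PREDETERMINED_DISTANCE = 50
DATABASE = {"0x330": "file A", "0x332": "file B", "0x330-0x332": "file for the group"}
ERRONEOUS_MESSAGE = "this combination of information gadgets is meaningless"

def interactive_information_method(readings):
    """readings: (id_code, x_position) pairs decoded from the signal (steps 301-304)."""
    if len(readings) <= 1:                                    # step 305
        return [DATABASE[readings[0][0]]]                     # steps 306-308
    (id_a, x_a), (id_b, x_b) = readings[:2]
    if abs(x_a - x_b) >= PREDETERMINED_DISTANCE:              # step 309
        return [DATABASE[id_a], DATABASE[id_b]]               # steps 310-312
    ordered = [c for c, _ in sorted(readings[:2], key=lambda r: r[1])]   # step 313
    grouped_id = "-".join(ordered)                            # step 314
    if grouped_id in DATABASE:                                # step 315
        return [DATABASE[grouped_id]]                         # steps 319-321
    return [ERRONEOUS_MESSAGE]                                # steps 316-318

print(interactive_information_method([("0x330", 5.0), ("0x332", 30.0)]))
```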
- the fourth embodiment can also execute all the operations and functions set forth in the first, second, and third embodiments. How the fourth embodiment executes these operations and functions will be readily appreciated by those of ordinary skill in the art based on the explanation of the first, second, and third embodiments, and thus will not be further described herein.
- When the processing unit receives an ID code (indicated by, e.g., a 2D barcode) from the reading apparatus, the processing unit will retrieve a file corresponding to the information gadget(s) from the database stored in the storage unit and transmit the file to the output apparatus. Then, the output apparatus will output the file received from the processing unit.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012549229A JP2013518292A (en) | 2010-01-25 | 2010-07-16 | Interactive information system, interactive information method and computer-readable medium thereof |
KR1020127021794A KR20130009748A (en) | 2010-01-25 | 2010-07-16 | Interactive information system, interactive information method, and computer readable medium thereof |
EP10843710.4A EP2529363A4 (en) | 2010-01-25 | 2010-07-16 | Interactive information system, interactive information method, and computer readable medium thereof |
AU2010342986A AU2010342986A1 (en) | 2010-01-25 | 2010-07-16 | Interactive information system, interactive information method, and computer readable medium thereof |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US29803410P | 2010-01-25 | 2010-01-25 | |
US61/298,034 | 2010-01-25 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011088670A1 true WO2011088670A1 (en) | 2011-07-28 |
Family
ID=44295971
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2010/075220 WO2011088670A1 (en) | 2010-01-25 | 2010-07-16 | Interactive information system, interactive information method, and computer readable medium thereof |
Country Status (6)
Country | Link |
---|---|
EP (1) | EP2529363A4 (en) |
JP (1) | JP2013518292A (en) |
KR (1) | KR20130009748A (en) |
CN (1) | CN102136202B (en) |
AU (1) | AU2010342986A1 (en) |
WO (1) | WO2011088670A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI478120B (en) * | 2013-07-08 | 2015-03-21 | Inventec Corp | Learninig system and method thereof |
TWI478121B (en) * | 2013-10-24 | 2015-03-21 | Inventec Corp | Inquiring system for learning image and method thereof |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5991693A (en) * | 1996-02-23 | 1999-11-23 | Mindcraft Technologies, Inc. | Wireless I/O apparatus and method of computer-assisted instruction |
CN1347065A (en) * | 2000-10-09 | 2002-05-01 | 鸿友科技股份有限公司 | Associated instantaneous talking education method |
CN201345173Y (en) * | 2009-02-05 | 2009-11-11 | 刘建生 | Point reading and learning machine based on optical identification |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1996003188A1 (en) * | 1994-07-28 | 1996-02-08 | Super Dimension Inc. | Computerized game board |
JP3792502B2 (en) * | 2000-11-22 | 2006-07-05 | 独立行政法人科学技術振興機構 | Thinking support system |
JP2002344851A (en) * | 2001-05-15 | 2002-11-29 | Nakano Joho Gijutsu Kenkyusho:Kk | Method for reproducing video image for learning materials |
JP2004260711A (en) * | 2003-02-27 | 2004-09-16 | Inst For Information Industry | Method for creating video search database and recording medium |
JP3851907B2 (en) * | 2004-02-18 | 2006-11-29 | 株式会社ソニー・コンピュータエンタテインメント | Image display system and video game system |
JP2005308875A (en) * | 2004-04-19 | 2005-11-04 | Sony Corp | Information retrieval system, information retrieval method, recording medium, and program |
GB2424510A (en) * | 2005-03-24 | 2006-09-27 | Nesta | Interactive blocks. |
JP2006350058A (en) * | 2005-06-17 | 2006-12-28 | Nippon Telegr & Teleph Corp <Ntt> | Intellectual education system |
US7639143B2 (en) * | 2006-09-29 | 2009-12-29 | Intel Corporation | Method and apparatus for visospatial and motor skills testing of patient |
JP2009015130A (en) * | 2007-07-06 | 2009-01-22 | Highware Co Ltd | Dictation server, and dictation terminal |
TW200917171A (en) * | 2007-10-08 | 2009-04-16 | G Time Electronic Co Ltd | Sensing type learning card and learning system thereof |
GB2459490B (en) * | 2008-04-24 | 2012-06-13 | Best Learning Materials Corp | Scan puzzle playing device |
-
2010
- 2010-07-16 EP EP10843710.4A patent/EP2529363A4/en not_active Withdrawn
- 2010-07-16 KR KR1020127021794A patent/KR20130009748A/en not_active Application Discontinuation
- 2010-07-16 WO PCT/CN2010/075220 patent/WO2011088670A1/en active Application Filing
- 2010-07-16 AU AU2010342986A patent/AU2010342986A1/en not_active Abandoned
- 2010-07-16 JP JP2012549229A patent/JP2013518292A/en active Pending
- 2010-07-16 CN CN201010236526.1A patent/CN102136202B/en not_active Expired - Fee Related
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5991693A (en) * | 1996-02-23 | 1999-11-23 | Mindcraft Technologies, Inc. | Wireless I/O apparatus and method of computer-assisted instruction |
CN1347065A (en) * | 2000-10-09 | 2002-05-01 | 鸿友科技股份有限公司 | Associated instantaneous talking education method |
CN201345173Y (en) * | 2009-02-05 | 2009-11-11 | 刘建生 | Point reading and learning machine based on optical identification |
Non-Patent Citations (1)
Title |
---|
See also references of EP2529363A4 * |
Also Published As
Publication number | Publication date |
---|---|
KR20130009748A (en) | 2013-01-23 |
CN102136202B (en) | 2014-10-15 |
EP2529363A1 (en) | 2012-12-05 |
JP2013518292A (en) | 2013-05-20 |
AU2010342986A1 (en) | 2012-08-16 |
EP2529363A4 (en) | 2014-12-24 |
CN102136202A (en) | 2011-07-27 |
Legal Events
- 121 (Ep: the EPO has been informed by WIPO that EP was designated in this application): Ref document number 10843710; country of ref document EP; kind code A1
- WWE (WIPO information: entry into national phase): Ref document numbers 2012549229 (JP), 2010342986 (AU), 6542/DELNP/2012 (IN)
- NENP (Non-entry into the national phase): Ref country code DE
- WWE (WIPO information: entry into national phase): Ref document number 2010843710 (EP)
- ENP (Entry into the national phase): Ref document number 2010342986 (AU); date of ref document 20100716; kind code A
- ENP (Entry into the national phase): Ref document number 20127021794 (KR); kind code A