
CN102103707A - Emotion engine, emotion engine system and control method of electronic device - Google Patents


Info

Publication number
CN102103707A
CN102103707A · CN2009102580528A · CN200910258052A (alternate forms of the same application and publication numbers)
Authority
CN
China
Prior art keywords
behavior
behavior pattern
emotion
control module
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2009102580528A
Other languages
Chinese (zh)
Other versions
CN102103707B (en)
Inventor
吴立伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Phison Electronics Corp
Original Assignee
Phison Electronics Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Phison Electronics Corp filed Critical Phison Electronics Corp
Priority: CN200910258052.8A, granted as CN102103707B
Publication of CN102103707A
Application granted
Publication of CN102103707B
Legal status: Active
Anticipated expiration legal status noted



Landscapes

  • Toys (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to an emotion engine, an emotion engine system, and a control method of an electronic device, wherein the emotion engine is suitable for the electronic device. The emotion engine system comprises a behavior control unit, a sensing unit, a time unit, and a behavior database. The behavior control unit provides a first behavior mode and a second behavior mode. When the sensing unit is enabled, it generates trigger sensing information or initial sensing information for the behavior control unit. The time unit generates time information for the behavior control unit. The behavior database stores a plurality of behavior data, wherein the first behavior mode and the second behavior mode each correspond to at least one of the behavior data. Accordingly, the behavior control unit decides the behavior data corresponding to the second behavior mode according to the time information, the trigger sensing information, and the first behavior mode. With the emotion engine, emotion engine system, and control method provided by the invention, the electronic device has a low manufacturing cost and can describe and express the personality traits of a robot in a fine-grained way.

Description

Emotion engine, emotion engine system and control method of electronic device
Technical field
The present invention relates to a system and a control method thereof, and more particularly to an emotion engine, an emotion engine system, and a control method of an electronic device.
Background technology
In recent years, electronic devices with artificial intelligence have been released in rapid succession, and the emotion technology behind them has gradually attracted attention. For an electronic device such as a robot, emotion technology (Emotion Technology) has always been a distinctive technique within robotics, integrating information engineering and control engineering.
Specifically, the core concept of emotion technology is the realization of an emotion space (Emotion Space). Its appeal lies in the fact that, through algorithms and bionic structural design, the control system can give a robot emotions and interactive abilities approximating those of a living being. This differs from a traditional robot, which can only express a mimicked emotion through a cold exterior or simple, static interaction. The characteristic of emotion technology is therefore that it can endow a cold mechanical structure with more flexible and more meaningful interaction capability and express the robot's inner emotions, so that the machine does not appear stiff during interaction.
However, the emotion technology of robots has long faced many technical bottlenecks. In the control system, the control unit not only needs to integrate a large amount of information, but also needs high-order artificial-intelligence algorithms as recognition rules. Therefore, under low-cost considerations, building an emotion engine in the conventional manner that meets the demands of the existing market is difficult.
Summary of the invention
An object of the embodiments of the invention is to provide an emotion engine that gives an electronic device a low manufacturing cost and, when the electronic device is applied as a robot, can describe and express the robot's personality traits in a fine-grained way.
An object of the embodiments of the invention is to provide an emotion engine system that gives an electronic device a low manufacturing cost and, when the electronic device is applied as a robot, can describe and express the robot's personality traits in a fine-grained way.
An object of the embodiments of the invention is to provide a control method of an electronic device. When the electronic device is applied as a robot, the method can describe and express the robot's personality traits in a fine-grained way.
An embodiment of the invention provides an emotion engine suitable for an electronic device. The emotion engine comprises a behavior control unit, which includes an emotion simulation unit, receives time information and trigger sensing information, and provides a first behavior mode and a second behavior mode. Accordingly, the behavior control unit decides the behavior data corresponding to the second behavior mode according to the time information, the trigger sensing information, and the first behavior mode, wherein the first behavior mode and the second behavior mode each correspond to at least one of a plurality of behavior data.
Another embodiment of the invention provides an emotion engine system suitable for an electronic device. The emotion engine system comprises a behavior control unit, a sensing unit, a time unit, and a behavior database. The behavior control unit includes an emotion simulation unit and provides a first behavior mode and a second behavior mode. The sensing unit is connected to the behavior control unit and, when enabled, generates trigger sensing information or initial sensing information for the behavior control unit. The time unit is connected to the behavior control unit and generates time information for it. The behavior database is connected to the behavior control unit and stores a plurality of behavior data, wherein the first behavior mode and the second behavior mode each correspond to at least one of the behavior data. Accordingly, the behavior control unit decides the behavior data corresponding to the second behavior mode according to the time information, the trigger sensing information, and the first behavior mode.
In an embodiment of the invention, the emotion simulation unit further generates a random signal, and the behavior control unit decides the behavior data corresponding to the second behavior mode according to the time information, the trigger sensing information, the first behavior mode, and the random signal.
In an embodiment of the invention, the behavior control unit decides the behavior data corresponding to the first behavior mode according to at least one of the time information and the initial sensing information.
In an embodiment of the invention, the emotion engine system further comprises a component driving unit, and the initial sensing information is a power signal. After receiving the initial sensing information, the behavior control unit provides the first behavior mode and drives the component driving unit to execute the first behavior mode.
In an embodiment of the invention, after the behavior control unit decides the behavior data corresponding to the second behavior mode, it drives the component driving unit to execute the second behavior mode.
In an embodiment of the invention, the component driving unit comprises at least one of a motor control unit and a multimedia voice control unit, and the sensing unit comprises at least one of a touch sensing module, a sound sensing and positioning module, and a shake sensing module.
In an embodiment of the invention, the time information comprises time-span information measured from receipt of the initial sensing information to receipt of the trigger sensing information, or system age information of the electronic device.
In an embodiment of the invention, the emotion simulation unit generates an emotion point and a plurality of situation event points in a virtual space. Each situation event point has a corresponding coordinate and a corresponding behavior mode. The emotion point moves from an old coordinate to a new coordinate according to the time information, the trigger sensing information, and the first behavior mode, and the emotion simulation unit finds the corresponding situation event point according to the new coordinate to decide the behavior data corresponding to the second behavior mode.
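The emotion-point movement and nearest-situation-point lookup can be sketched in a few lines. This is an illustrative reconstruction under stated assumptions, not the patent's implementation: the coordinate axes, the example situation points, and the choice of Euclidean distance are all invented for the sketch.

```python
import math

# Hypothetical virtual emotion space: each situation event point
# pairs a coordinate with a behavior mode (axes are assumptions,
# e.g. pleasure and arousal).
situation_points = {
    "joyful": (0.8, 0.6),
    "angry": (-0.7, 0.8),
    "sad": (-0.6, -0.5),
    "sleepy": (0.1, -0.9),
}

def move_emotion_point(old, emotion_vector):
    """Move the emotion point from its old coordinate by an emotion
    vector derived from time info, trigger info, and the first mode."""
    return tuple(o + v for o, v in zip(old, emotion_vector))

def nearest_situation(point):
    """Find the situation event point closest to the emotion point;
    its behavior mode decides the second behavior mode's data."""
    return min(situation_points,
               key=lambda k: math.dist(point, situation_points[k]))

new_point = move_emotion_point((0.0, 0.0), (-0.5, 0.7))
print(nearest_situation(new_point))  # the closest situation point wins
```

A dictionary keyed by mode name keeps the sketch short; a real behavior database would map each situation point to stored image, voice, and motor behavior data.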
Another embodiment of the invention provides a control method of an electronic device, whose steps are as follows. First, a first behavior mode is provided. Next, when a sensing unit of the electronic device is enabled, trigger sensing information is generated. Then, time information is generated. Finally, behavior data corresponding to a second behavior mode is decided according to the time information, the trigger sensing information, and the first behavior mode.
In an embodiment of the invention, the control method further comprises generating a random signal, and the behavior data corresponding to the second behavior mode is decided according to the time information, the trigger sensing information, the first behavior mode, and the random signal.
In an embodiment of the invention, the control method further comprises deciding the behavior data corresponding to the first behavior mode according to initial sensing information.
In an embodiment of the invention, the time information comprises time-span information measured from receipt of the initial sensing information to receipt of the trigger sensing information, or system age information of the electronic device.
In an embodiment of the invention, the control method further comprises analyzing the type of the trigger sensing information.
In an embodiment of the invention, the step of providing the first behavior mode comprises providing and executing the first behavior mode after a power signal is received.
In an embodiment of the invention, after the behavior data corresponding to the second behavior mode is decided, a component driving unit of the electronic device is driven to execute the second behavior mode.
In an embodiment of the invention, the step of deciding the behavior data corresponding to the second behavior mode comprises generating an emotion point and a plurality of situation event points in a virtual space. The emotion point moves according to the time information, the trigger sensing information, and the first behavior mode, and the behavior data corresponding to the second behavior mode corresponds to the situation event point nearest to the emotion point.
In an embodiment of the invention, the control method further comprises the following steps. First, a corresponding emotion vector is obtained according to the first behavior mode, the time information, and the trigger sensing information. Next, the coordinate of the emotion point in the virtual space is revised according to the emotion vector. Finally, the situation event point at the shortest distance from the revised emotion point is obtained, and the behavior data corresponding to the second behavior mode is decided according to the obtained situation event point.
In an embodiment of the invention, the control method further comprises providing a convergence point and, in the step of revising the coordinate of the emotion point in the virtual space according to the emotion vector, providing a regression force so that the emotion point moves toward the convergence point.
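The regression force pulling the emotion point back toward the convergence point can be sketched as a fractional step toward that point each update. This is a hedged illustration: the pull strength and the linear form of the pull are assumptions, not the patent's formula.

```python
# Hypothetical sketch of the regression force F pulling the emotion
# point S toward the convergence point P. The default strength 0.2
# is an invented illustration.

def apply_regression(emotion_point, convergence_point, strength=0.2):
    """Move the emotion point a fraction of the way toward the
    convergence point, modeling the mood settling back to neutral."""
    return tuple(s + strength * (p - s)
                 for s, p in zip(emotion_point, convergence_point))

point = (1.0, -0.5)
for _ in range(3):            # repeated updates converge on P
    point = apply_regression(point, (0.0, 0.0))
print(point)
```

Because the convergence point's own coordinate may change with the time information (see below in the description), a fuller sketch would recompute `convergence_point` before each update.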
In an embodiment of the invention, the coordinate corresponding to the convergence point changes as the time information differs.
In an embodiment of the invention, the virtual space is a multi-dimensional space having a plurality of coordinate axes, the behavior control unit obtains the situation event point corresponding to the new coordinate through a vector operation, and each coordinate axis of the virtual space represents a different mental-state behavior of the electronic device.
In an embodiment of the invention, the step of deciding the behavior data corresponding to the second behavior mode according to the obtained situation event point comprises providing the behavior data corresponding to the second behavior mode according to system age information and the obtained situation event point.
Based on the above, the emotion engine and the emotion engine system provided by the embodiments of the invention give an electronic device a low manufacturing cost and, when the electronic device is applied as a robot, can describe and express the robot's personality traits in a fine-grained way. They also have a modular architecture and are not limited to the exterior form or design of a specific electronic device. In addition, the control method of the electronic device provided by the embodiments of the invention uses real-time emotion computation and the concept of an emotion point. Therefore, when the electronic device is applied as a robot, the method can describe and express the robot's personality traits in a fine-grained way and, under low-cost considerations, can meet the demands of the existing market.
To make the above features and advantages of the invention more comprehensible, embodiments are described in detail below with reference to the accompanying drawings.
Description of drawings
Fig. 1 is a schematic diagram of an emotion engine system according to an embodiment of the invention.
Fig. 2 is a detailed schematic diagram of the emotion engine system of Fig. 1.
Fig. 3 is a schematic diagram of the emotional responses of a robot operating in a virtual space.
Fig. 4 is a flowchart of a robot control method according to an embodiment of the invention.
Fig. 5 is a detailed flowchart of step S407 in Fig. 4.
Description of main element symbols:
100: emotion engine system; 110: behavior control unit;
111: tense control unit; 113: emotion simulation unit;
112: initial state; 114: child tense module;
116: growth/transition interface; 118: adult tense module;
114a, 118a: normal mode; 114b, 118b: active event mode;
114c, 118c: passive event mode; 114d, 118d: sleep mode;
114e, 118e: wake-up mode; 114f, 118f: random behavior mode;
120: sensing unit; 130: component driving unit;
130a: motor control unit; 130b: multimedia voice control unit;
120a: touch sensing module; 120b: sound sensing and positioning module;
120c: shake sensing module; 140: behavior database;
140a: behavior data of the child tense module; 140b: behavior data of the adult tense module;
140c: index search unit; 150: time unit;
152: physiological clock record table; S: emotion point;
Q: situation event point; T: motion track;
V: emotion vector; P: convergence point;
R: interval radius; F: regression force;
S401, S403, S405, S407, S409: steps;
S501, S503, S505, S507, S509, S511: steps.
Embodiments
Fig. 1 is a schematic diagram of an emotion engine system according to an embodiment of the invention. Referring to Fig. 1, the emotion engine system 100 of the present embodiment comprises a behavior control unit 110, a sensing unit 120, a time unit 150, a behavior database 140, and a component driving unit 130, wherein the behavior control unit 110 comprises a tense control unit 111 and an emotion simulation unit 113. The sensing unit 120, the time unit 150, and the behavior database 140 are each connected to the behavior control unit 110 to assist it in providing a first behavior mode and a second behavior mode.
In the present embodiment, the emotion engine system 100 is suitable for an electronic device with emotional responses, such as a robot or an electronic pet (not shown). A robot is taken as the example here, but the invention is not limited thereto. The emotional response of the robot may be a reaction mechanism produced by interaction with a user, a behavior scheduled by the time unit 150, or a corresponding behavior produced by the demands of an external situation.
Below, the emotion engine system 100 is taken as an example to describe its operation in more detail.
In the present embodiment, the tense control unit 111 and the emotion simulation unit 113 may be implemented as firmware in one or more control circuits. For example, a plurality of control instructions may be burned into a program memory (for example, a read-only memory (ROM)) connected to a microprocessor, so as to form the behavior control unit 110 with the tense control unit 111 and the emotion simulation unit 113. Therefore, when the emotion engine system 100 operates, the microprocessor can execute the control instructions that form the tense control unit 111 and the emotion simulation unit 113 to accomplish the emotion engine management mechanism of the embodiment of the invention.
When the behavior control unit 110 receives initial sensing information, it can provide a first behavior mode and drive the component driving unit 130 to execute the first behavior mode, wherein the initial sensing information may be a power signal or another drive signal. Here, the behavior mode exhibited by the robot may be one of a normal mode, an active event mode, a passive event mode, a sleep mode, a wake-up mode, and a random behavior mode, but the invention is not limited thereto. For example, when the power of the robot is turned on, the behavior control unit 110 provides the wake-up mode, to express in a fine-grained way the corresponding behavior or state a living individual would exhibit.
In the present embodiment, the sensing unit 120 is connected to the behavior control unit 110. When the robot receives an external stimulus, the sensing unit 120 is enabled and generates trigger sensing information for the behavior control unit 110. For example, when the robot interacts with the user, the sensing unit 120 generates trigger sensing information for the behavior control unit 110 according to the mode of interaction. When the robot is stroked, patted, or shaken by the user, or when the robot is struck, the behavior control unit 110 can judge the type of the trigger sensing information received via the sensing unit 120 and, together with other reference information, make the robot exhibit corresponding emotional reactions such as joy, anger, or sadness.
In addition, the time unit 150 is connected to the behavior control unit 110. In the present embodiment, the time unit 150 may comprise a physiological clock record table 152 for recording the system age of the robot and for generating, at certain times, time information for the behavior control unit 110, so that the behavior control unit 110 produces different interactive demands toward the user according to the robot's different life stages. Moreover, after the behavior control unit 110 receives the initial sensing information and the trigger sensing information, the time unit 150 can also provide time-span information to the behavior control unit 110. The time-span information is the time difference between the behavior control unit 110 receiving the initial sensing information and receiving the trigger sensing information, and it assists the behavior control unit 110 in deciding the behavior data corresponding to the second behavior mode.
Therefore, the behavior control unit 110 of the emotion engine system 100 can express the robot's behavior in a fine-grained way according to the robot's physiological clock and life stage, like the corresponding behaviors or states a living individual exhibits in different life stages.
In the present embodiment, the behavior database 140 stores a plurality of behavior data, and each behavior mode of the robot corresponds to at least one of the behavior data. In emotion technology, the behavior data may correspond to arbitrary coordinate points in a virtual space, and the virtual space may be a realization of an emotion space. Here, the virtual space may be a two-dimensional (2-D), three-dimensional (3-D), or multi-dimensional space, and the behavior data stored in the behavior database 140 may comprise image data, voice data, and motor behavior data, but the invention is not limited thereto.
In the present embodiment, behavior control module 110 can determine the pairing behavioral data of second behavior pattern according to temporal information, triggering sensitive information and first behavior pattern.For example, the time quantum of emotion engine system 100 150 at noon the time send a time information so that robot produces a behavior in order to need to propose the demand of feeding to the user.Simultaneously, the user is feeding after 5 minutes, and sensing cell 120 can produce the triggering sensitive information of reception feeding to behavior control module 110, so that behavior control module 110 control robot show corresponding emotional reactions.For example, if the quantity of user's feeding is not enough, then behavior control module 110 control robot produce angry emotional reactions.Otherwise then robot produces the emotional reactions that satisfy.At this, time quantum 150 temporal information sent of time at noon is an initial sensitive information, and robot generation this moment is in order to need the behavior of the demand of feeding can correspond to first behavior pattern to user's proposition.Then, robot accepts user's feeding after waiting for 5 minutes, and this wait of 5 minutes also is a time information, and the signal of this feeding then can be one and triggers sensitive information.Therefore, behavior control module 110 is after receiving the signal of feeding, then according to proposing to need the demand of feeding, the time of wait and the quantity of feeding, judge to produce satisfied, angry or the emotional reactions of the demand of feeding again.At this, corresponding emotional reactions can be corresponding to second behavior pattern.Therefore, behavior control module 110 can determine the pairing behavioral data of second behavior pattern according to temporal information, triggering sensitive information and first behavior pattern.
Another example illustrates this further. Suppose the robot is struck once at 11 p.m. and struck again one minute later. The robot then produces an emotional reaction expressing great anger. However, if the second strike occurred only after one hour, the robot's mood would have calmed because of the long interval since the first strike; moreover, the robot's physiological clock would by then indicate sleeping time. Therefore, the reaction produced would be only a slightly angry one.
Therefore, as the above examples show, the time information (that is, the time-span information from receiving the initial sensing information to receiving the trigger sensing information), the trigger sensing information, and the first behavior mode all influence the decision of the behavior data corresponding to the second behavior mode.
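The feeding and striking examples amount to a decision rule over three inputs: the first behavior mode, the trigger type, and the elapsed time. A minimal sketch follows; the thresholds, mode names, and the `asleep` flag standing in for the physiological clock are all invented for illustration.

```python
def decide_second_mode(first_mode, trigger, elapsed_s, asleep=False):
    """Decide the second behavior mode from the first behavior mode,
    the trigger sensing information, and the time-span information.
    All thresholds and mode names here are illustrative assumptions."""
    if trigger == "strike":
        if elapsed_s < 60 and first_mode == "angry":
            return "very_angry"      # struck again within a minute
        return "slightly_angry" if asleep else "angry"
    if trigger == "feeding" and first_mode == "demand_feeding":
        return "satisfied"
    return first_mode                # no relevant trigger: unchanged

print(decide_second_mode("angry", "strike", elapsed_s=30))       # escalates
print(decide_second_mode("angry", "strike", 3600, asleep=True))  # calmed
```

In the patent's actual mechanism these inputs move an emotion point in the virtual space rather than feeding a rule table; the sketch only makes the dependency on the three inputs concrete.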
In addition, in the present embodiment, the behavior control unit 110 can also provide the first behavior mode when it receives a power signal. For example, the behavior control unit 110 may control the robot to remain still or to perform a greeting behavior. In other embodiments, however, the behavior control unit 110 may also decide the behavior data corresponding to the first behavior mode according to the time information or the trigger sensing information, and drive the robot to execute the first behavior mode. That is, the first behavior mode is not limited to the default behavior data after power-on; its behavior data may also be decided according to the time information or the trigger sensing information.
Specifically, in the present embodiment, the behavior control unit 110 comprises the emotion simulation unit 113.
The emotion simulation unit 113 generates a virtual space with an emotion point and a plurality of situation event points. The emotion point and the situation event points each correspond to coordinate points in the virtual space. The emotion point moves in response to the time information and the initial sensing information, and the behavior control unit 110 can find the situation event point nearest to the emotion point's changed position to decide the first behavior mode. The emotion point may then move to another coordinate because of the first behavior mode, the time information, and the trigger sensing information, and the behavior control unit 110 can again find the nearest situation event point according to the changed position and use it to decide the behavior data corresponding to the second behavior mode. Here, the behavior data stored in the behavior database 140 correspond to different situation event points in the virtual space. That is, each situation event point corresponds to a coordinate point in the virtual space, and the emotion point moves continuously in the virtual space according to each piece of information. The emotion simulation unit 113 can execute a mathematical algorithm to obtain the situation event point nearest to the emotion point and thereby decide the behavior data corresponding to the second behavior mode. This is described further in the embodiments below.
In another embodiment, the emotion simulation unit 113 also generates a random signal, so that the emotion point changes because of the random signal as well. The robot's mood is then not only stimulated by external signals but also reflects its own indeterminate internal mood changes, producing more human-like variation in its behavior modes. From this point of view, in the present embodiment, the behavior control unit 110 can decide the behavior data corresponding to the second behavior mode according to the time information, the trigger sensing information, the first behavior mode, and the random signal. Likewise, the behavior control unit 110 can decide the behavior data corresponding to the first behavior mode according to the time information, the initial sensing information, and the random signal.
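The random signal can be pictured as a small perturbation of the emotion point's coordinate. This is a hedged sketch: the uniform noise model and its scale are assumptions, not the patent's definition of the random signal.

```python
import random

# Hypothetical sketch: a small random signal nudges the emotion point
# so the robot's mood also drifts on its own, not only under external
# stimuli. The noise scale 0.05 is an invented illustration.

def perturb(emotion_point, scale=0.05, rng=random):
    """Add bounded random noise to each coordinate of the emotion point."""
    return tuple(c + rng.uniform(-scale, scale) for c in emotion_point)

random.seed(0)                 # seeded only to make the demo repeatable
print(perturb((0.2, -0.1)))
```

Because the perturbation is bounded, it adds lifelike variation without overriding the deterministic movement driven by time and trigger information.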
In yet another embodiment, the behavior control unit may further comprise a tense control unit 111 for adjusting, according to the time information provided by the time unit 150, the behavior data corresponding to the first behavior mode or the second behavior mode provided by the behavior control unit 110. For example, the time information may indicate whether the robot is in its adult phase or its juvenile phase, and the reaction to the same trigger sensing information is not necessarily identical across phases. For instance, the emotional reaction of a robot in the adult phase to being patted may be slight indignation, while that of a robot in the juvenile phase may be anger and sadness.
Fig. 2 is a detailed schematic diagram of the emotion engine system of Fig. 1. Referring to Fig. 2, in the present embodiment, the tense control unit 111 comprises a child tense module 114 and an adult tense module 118. Each tense module comprises a plurality of behavior modes. For example, the child tense module 114 comprises a normal mode 114a, an active event mode 114b, a passive event mode 114c, a sleep mode 114d, a wake-up mode 114e, and a random behavior mode 114f. Similarly, the adult tense module 118 comprises at least the same modes, namely a normal mode 118a, an active event mode 118b, a passive event mode 118c, a sleep mode 118d, a wake-up mode 118e, and a random behavior mode 118f.
Specifically, the tense control unit 111 can express the robot's behavior in the different tense modules in a fine-grained way according to the robot's system age, like the corresponding behaviors or states a living individual exhibits in different life stages. For example, after the emotion engine system 100 is initialized, the robot's life stage starts from the initial state 112 according to the system age recorded in the physiological clock record table 152 of the time unit 150 and enters the child tense module 114 to execute the corresponding behavior modes. Then, as the robot's system age grows, the tense control unit 111 switches the tense module to the adult tense module 118 via the growth/transition interface 116.
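The life-stage switching above reduces to a lookup on the recorded system age. A minimal sketch, assuming a single age threshold (the 30-day value and function names are invented; the patent does not specify when the growth/transition interface fires):

```python
# Hypothetical sketch of the tense control unit choosing between the
# child tense module and the adult tense module from the system age
# recorded in the physiological clock record table.

CHILD_MODES = {"normal", "active_event", "passive_event",
               "sleep", "wake_up", "random_behavior"}

def select_tense_module(system_age_days, threshold_days=30):
    """Return which tense module governs behavior at this system age."""
    return "adult" if system_age_days >= threshold_days else "child"

print(select_tense_module(3))    # newly initialized robot
print(select_tense_module(90))   # after growth/transition
```

Both modules expose the same set of behavior modes; what differs between them, per the description, is how each mode is expressed (for example, shorter sleep in the adult module).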
In the child tense module 114, the tense control unit 111 can control the robot's behaviors randomly or according to the time information received from the time unit 150. For example, when executing the active event mode 114b, the tense control unit 111 can control the robot to make active demands on the user and require that they be satisfied. In the present embodiment, such an event may be the robot demanding feeding after a predetermined time or upon randomly feeling hungry, or the robot demanding to urinate or defecate after feeding. The active event mode 114b may also involve the tense control unit 111 controlling the robot to perform a book- or newspaper-reading action.
In addition, when executing the passive event mode 114c, the temporal control module 111 can control the robot to respond appropriately to behavior states it passively receives. For example, when the robot is stroked, dropped or patted by the user, the temporal control module 111 controls the robot to show joy, pain or happiness according to the system age recorded in the physiological clock table 152, in cooperation with the situation event points of the emotion simulation unit 113. In the random behavior mode 114f, the temporal control module 111 can control the robot to perform various random behaviors, for example a game mode or a roaming mode. In the present embodiment, when the temporal control module 111 is switched to the child temporal module 114, the temporal control module 111 can also control the robot to execute the general mode 114a, the sleep mode 114d and the get-up mode 114e according to the physiological clock recorded in the physiological clock table 152.
Afterwards, as the system age of the robot grows, the temporal module of the temporal control module 111 is switched to the adult temporal module 118 through the growth/transition interface 116. Similarly, in the adult temporal module 118, the temporal control module 111 can, independently or in cooperation with the emotion simulation unit 113, drive the element driver unit 130 to control the robot to perform the behaviors corresponding to the various behavior modes. It should be noted that in the adult temporal module 118, the performances corresponding to the various behavior modes may differ from those performed in the child temporal module 114. For example, when executing the sleep mode 118d, the robot may need a shorter sleep time than in the sleep mode 114d. Likewise, when executing the get-up mode 118e, the robot may linger in bed for a shorter time than in the get-up mode 114e. Of course, the invention is not limited thereto. In other embodiments, the behavior modes in the different temporal modules may be slightly adjusted according to the corresponding behaviors or states that a living individual exhibits in different life stages.
Therefore, the temporal control module 111 can produce the emotional reactions of different periods through a common virtual space, and control the peripheral system through the element driver unit 130. In this way, the temporal control module 111 can, according to the system age of the robot, finely express the behavior of the robot in the different temporal modules, like the corresponding behaviors or states that a living individual exhibits in different life stages.
Referring again to Fig. 2, in the present embodiment, the time unit 150 comprises a physiological clock table 152, which records the system age of the robot and its physiological clock, and the temporal module of the temporal control module 111 can be switched according to the time information. In addition, the physiological clock table 152 records a bionic life cycle and a physiological clock cycle. The bionic life cycle simulates the corresponding behaviors or states that a living individual exhibits in different life stages, and the physiological clock cycle simulates the daily routine of a living individual. In the present embodiment, the bionic life cycle and the physiological clock cycle can be preset in the physiological clock table 152, but the invention is not limited thereto.
Therefore, the time unit 150 can switch the temporal module of the temporal control module 111 according to the relation between the recorded system age and the bionic life cycle. For example, as the system age grows, the time unit 150 can switch the temporal module of the temporal control module 111 from the child temporal module 114 to the adult temporal module 118. Specifically, after the emotion engine system 100 is initialized, the system age of the robot recorded in the physiological clock table 152 may be 0 years old. Afterwards, as time goes by, the physiological clock table 152 keeps recording the system age of the robot. For example, when one real day has passed, the recorded system age of the robot is 1 year old; when two real days have passed, the system age of the robot is 2 years old, and so on. Of course, the invention is not limited thereto.
Then, the time unit 150 can switch the temporal module of the temporal control module 111 according to the relation between the recorded system age and the bionic life cycle. For example, if the bionic life cycle enters the adult period at a predetermined system age of 20 years old, then when 20 real days have passed, the time unit 150 switches the temporal module of the temporal control module 111 to the adult temporal module 118. Therefore, as the system age of the robot grows, the temporal module of the temporal control module 111 is switched to the adult temporal module 118 through the growth/transition interface 116.
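The age-to-stage mapping described above (one real day advancing the system age by one year, with the adult temporal module entered at a preset age of 20) can be sketched as follows; the class and attribute names are hypothetical, not taken from the patent:

```python
from dataclasses import dataclass

ADULT_AGE = 20  # preset system age (in "years") at which the adult stage begins

@dataclass
class PhysiologicalClockTable:
    """Minimal sketch of record table 152: one real day = one system year."""
    elapsed_days: float = 0.0  # real time since initialization

    @property
    def system_age(self) -> int:
        return int(self.elapsed_days)  # 1 day -> 1 year, 2 days -> 2 years, ...

    def stage(self) -> str:
        # switch from the child temporal module to the adult temporal module
        return "adult" if self.system_age >= ADULT_AGE else "child"

table = PhysiologicalClockTable(elapsed_days=2.5)
print(table.system_age, table.stage())  # 2 child
```

A real implementation would update `elapsed_days` from a hardware clock; here it is set directly for illustration.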
In addition, when receiving the time information from the time unit 150, the temporal control module 111 can also control the robot to execute a corresponding one of the above behavior modes according to the relation between the physiological clock and the physiological clock cycle. Specifically, if the physiological clock cycle presets the get-up mode at 7 a.m., then when the real time is 7 a.m., the time unit 150 provides time information to the temporal control module 111, so that the temporal control module 111 controls the robot to execute the get-up mode. Similarly, if the physiological clock cycle presets the sleep mode at 7 p.m., then when the real time is 7 p.m., the temporal control module 111 controls the robot to execute the sleep mode according to the time information. In the same way, the user can preset the physiological clock cycle so that the robot executes a corresponding one of the above behavior modes at a specific time.
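A user-configurable physiological clock cycle of this kind can be sketched as a simple hour-of-day schedule; the function and mode names are illustrative only:

```python
# Preset physiological clock cycle: hour of day -> behavior mode.
# The 7 a.m. get-up and 7 p.m. sleep entries follow the example values in the text.
PHYSIOLOGICAL_CLOCK_CYCLE = {7: "get_up", 19: "sleep"}

def mode_for_hour(hour: int, schedule=PHYSIOLOGICAL_CLOCK_CYCLE) -> str:
    """Return the behavior mode the temporal control module should execute."""
    return schedule.get(hour, "general")

print(mode_for_hour(7), mode_for_hour(19), mode_for_hour(12))  # get_up sleep general
```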
Therefore, the emotion engine system 100 can, according to the relation between the system age and the bionic life cycle and the relation between the physiological clock and the physiological clock cycle, finely express the behavior of the robot in the different temporal modules, and make the robot execute preset behavior modes at specific times, like the states that a living individual exhibits in different life stages and the physiological behaviors of a living individual at different times.
In the present embodiment, the sensing unit 120 detects or senses different enable signals and accordingly provides the behavior control module 110 with the corresponding initial or trigger sensing information. The sensing unit 120 comprises a contact sensing module 120a, a sound sensing and positioning module 120b and a shake sensing module 120c.
The element driver unit 130 is controlled by the behavior control module 110 and drives peripheral devices to perform corresponding behaviors. The element driver unit 130 comprises a motor control unit 130a and a multimedia voice control module 130b.
For example, when the user claps near the robot, the sound sensing and positioning module 120b of the sensing unit 120 is enabled, detects the direction of the user who clapped, and generates a sensing signal to the behavior control module 110. The behavior control module 110 can thereby drive the motor control unit 130a to make the robot move toward the detected direction. Then, when the robot reaches the position where the user clapped, the multimedia voice control module 130b can drive the robot to emit a preset sound to interact with the user. At this moment, if the user strokes or shakes the robot, the contact sensing module 120a or the shake sensing module 120c transmits its sensing information to the behavior control module 110, and the behavior control module 110 in turn drives the multimedia voice control module 130b to perform further interaction with the user.
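The clap-localization interaction above amounts to a simple event-to-driver dispatch; the sensed-event and command strings below are hypothetical, chosen only to mirror the sequence in the text:

```python
def dispatch(event: str) -> list[str]:
    """Map a sensed event to element-driver commands (illustrative names)."""
    if event == "clap":       # from sound sensing and positioning module 120b
        return ["motor:turn_toward_sound", "motor:move_forward", "voice:greet"]
    if event == "stroke":     # from contact sensing module 120a
        return ["voice:interact"]
    if event == "shake":      # from shake sensing module 120c
        return ["voice:interact"]
    return []                 # unrecognized events are ignored

print(dispatch("clap"))
```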
In addition, the behavior database 140 stores the behavior data corresponding to the behavior modes in each temporal module and the behavior data corresponding to the emotional reactions of the robot, and the above behavior data respectively correspond to a plurality of situation event points in the virtual space. The behavior data stored in the behavior database 140 comprise image data, voice data and motor behavior data. In the present embodiment, the image data may be a dot-matrix picture library for light-emitting-diode pixels (LED pixels). The voice data may be audio information for various scenes, whose pitch or speaking speed can be adjusted by a multimedia chip (media chip) to meet the needs of various changing situations. The motor behavior data may be information about various motor posture actions and motion control trajectories.
Specifically, in the present embodiment, the behavior database 140 stores the image data, voice data and motor behavior data corresponding to the behavior modes in the child temporal module 114. Therefore, when the behavior control module 110 controls the robot through the element driver unit 130 to perform the behaviors corresponding to the various behavior modes, the robot can show the corresponding behavior reactions according to the behavior data 140a of the child temporal module stored in the behavior database 140. In this way, the emotion engine system 100 can finely express the behavior of the robot. Similarly, the behavior database 140 also stores the behavior data 140b of the adult temporal module corresponding to the behavior modes in the adult temporal module 118, so that the behavior of the robot in the adult temporal module 118 is equally fine.
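The split into stage-specific behavior data (140a for the child stage, 140b for the adult stage), each bundling image, voice and motor entries, might be organized as a keyed lookup; all keys and values here are placeholders invented for illustration:

```python
# (stage, mode) -> behavior data record with image, voice and motor entries
BEHAVIOR_DB = {
    ("child", "sleep"):  {"image": "led_sleep_a", "voice": "snore_soft", "motor": "curl_up"},
    ("adult", "sleep"):  {"image": "led_sleep_b", "voice": "snore_low",  "motor": "lie_flat"},
    ("child", "get_up"): {"image": "led_wake_a",  "voice": "yawn_long",  "motor": "stretch"},
}

def behavior_data(stage: str, mode: str) -> dict:
    """Look up the behavior data for a temporal stage and behavior mode."""
    return BEHAVIOR_DB[(stage, mode)]

print(behavior_data("child", "sleep")["motor"])  # curl_up
```

Keying by stage lets the same mode (e.g. sleep) map to different performances in the child and adult temporal modules, as the text describes.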
It can thus be seen that the behavior control module 110 of this example can determine the behavior data corresponding to the first behavior pattern according to the time information and the initial sensing information, or determine the behavior data corresponding to the second behavior pattern according to the time information, the trigger sensing information and the first behavior pattern.
Fig. 3 is a schematic diagram of the emotional reactions of the robot operating in the virtual space. Referring to Fig. 2 and Fig. 3 together, the virtual space of the present embodiment can be generated by the behavior control module 110. It may be a three-dimensional vector space whose coordinate axes respectively represent the pleasantness, arousal and certainty of the robot, so as to express the emotion scalar values of a bionic life individual. However, the invention is not limited thereto; it can also be applied to a two-dimensional virtual space or a virtual space of other dimensions. In addition, each coordinate axis may also represent anger, sadness, randomness, intelligence or other states of mind.
It should be noted that in the virtual space, the coordinate axes APC (Arousal, Pleasantness, Certainty) are psychological concepts; different emotion points of the bionic life individual are located at different coordinates in the virtual space, and each emotion point corresponds to a different behavior pattern. Moreover, in the virtual space, the emotion point S represents the current emotional state of the robot. For example, when the robot interacts with the user, the emotional reactions produced affect the position of the emotion point. In addition, the behaviors set by the internal physiological clock of the emotion engine system 100, and the behaviors produced in response to the demands of external situations, also affect the position of the emotion point.
Here, the behavior data stored in the behavior database 140 respectively correspond to a plurality of situation event points Q in the virtual space. In the present embodiment, the emotion point S can randomly produce an offset within an interval radius R, forming a random vibration interval that moves with it, so that the emotional reactions of the robot are not rigid. In addition, as time passes, the emotion simulation unit 113 in the behavior control module 110 can control the emotional reactions of the robot to form a motion trajectory T in the virtual space, so as to imitate the emotional reactions of a living individual under different situations. Specifically, the emotion simulation unit 113 performs a quantifying operation in response to the stimuli produced by changes in the external environment, and thereby generates an emotion vector V in the virtual space to displace the emotion point S, imitating the emotion changes of a living individual. Here, the stimuli produced by changes in the external environment are obtained by quantifying, for example, the time information, the initial or trigger sensing information, the first behavior pattern and a random signal, but the invention is not limited thereto.
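The displacement of the emotion point S by an emotion vector V, plus the random offset within the interval radius R, can be sketched in the three-axis APC space; the numeric magnitudes are arbitrary assumptions:

```python
import random

def apply_emotion_vector(point, vector):
    """Displace emotion point S by emotion vector V (component-wise add)."""
    return tuple(p + v for p, v in zip(point, vector))

def random_offset(point, radius):
    """Random vibration within interval radius R, so the emotion is not rigid."""
    return tuple(p + random.uniform(-radius, radius) for p in point)

s = (0.0, 0.0, 0.0)                         # (arousal, pleasantness, certainty)
s = apply_emotion_vector(s, (0.4, 0.8, -0.1))  # quantified external stimulus
s = random_offset(s, radius=0.05)              # small jitter inside radius R
```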
In another embodiment, the virtual space generated by the behavior control module 110 may also have a convergence point P. The convergence point P may be located at the origin or at any coordinate point of the virtual space, to represent the personality traits of the robot. For example, if the coordinates of the convergence point P have higher pleasantness and arousal, the personality of the robot is more optimistic and rational. Conversely, if the coordinates of the convergence point P have lower pleasantness and higher certainty, the robot may be more rigid and stubborn. Here, the convergence point P can instantly provide a virtual restoring force. In other words, the convergence point P can provide a restoring force according to the time information. When the robot receives no sensing signal, the emotion point S moves toward the convergence point P, simulating that the emotional reactions of the robot gradually return to its original personality traits when it is not disturbed. Furthermore, when the robot is disturbed, the restoring force of the convergence point P also helps the behavior control module 110 determine a suitable behavior pattern, so that robots with different personality traits produce different emotional reactions when receiving similar events. In other words, the length of time between receiving the initial sensing information and receiving the trigger sensing information also affects, through the restoring force of the convergence point P, the behavior data corresponding to the second behavior pattern.
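The restoring pull toward the convergence point P can be modeled as moving the emotion point a fixed fraction of the remaining distance at each step; the gain k is an assumed parameter, not specified in the patent:

```python
def restore_toward(emotion, convergence, k=0.1):
    """One step of the restoring force F: move fraction k of the way toward P."""
    return tuple(e + k * (c - e) for e, c in zip(emotion, convergence))

# With no further stimuli the emotion point drifts back to the personality point P.
point, P = (1.0, -0.5, 0.2), (0.0, 0.0, 0.0)
for _ in range(50):
    point = restore_toward(point, P, k=0.2)
```

After enough undisturbed steps the emotion point is effectively back at P, matching the "return to original personality traits" behavior described above.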
In another embodiment, in order to achieve diverse emotional personalities imitating a living body, the behavior control module 110 can also control the convergence point P to change its coordinate position over time, in a regular or irregular manner, so as to achieve a convergent personality feature. For example, according to the information of the time unit 150, the behavior control module 110 moves the convergence point P to a coordinate point with lower pleasantness, arousal and certainty during a predetermined physiological period. Alternatively, the behavior control module 110 controls the convergence point P to be at different coordinate positions in childhood and in adulthood. In this way, the behavior control module 110 can give the robot a mood performance close to real life.
Furthermore, when the robot receives an external stimulus, the emotion simulation unit 113 generates an emotion vector V acting on the emotion point S according to the external stimulus, and the convergence point P provides a restoring force F acting on the emotion point S, thereby moving the emotion point S to another coordinate point. Then, the emotion simulation unit 113 calculates the situation event point Q closest to the emotion point S at this moment, so as to determine the behavior pattern to be executed.
In the present embodiment, at each emotion point experienced during the emotion computation, the current emotional expression of the robot, including the intensity and depth of the emotion, can be selected according to the minimum distance between the emotion point and the situation event points. In the virtual space, a situation event point mainly records an expression characteristic of a particular emotion, namely the behavior the robot performs when it is happy, excited, afraid or sad.
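Selecting the expression by the minimum distance to a situation event point is a nearest-neighbour lookup; the coordinates below are invented for illustration:

```python
import math

SITUATION_POINTS = {           # mode -> coordinates in the APC space (made up)
    "happy":   ( 0.4,  0.9, 0.3),
    "excited": ( 0.9,  0.6, 0.2),
    "afraid":  ( 0.8, -0.7, 0.1),
    "sad":     (-0.5, -0.8, 0.4),
}

def nearest_situation(emotion_point, situations=SITUATION_POINTS) -> str:
    """Pick the situation event point Q closest to emotion point S."""
    return min(situations, key=lambda m: math.dist(emotion_point, situations[m]))

print(nearest_situation((0.5, 0.8, 0.3)))  # happy
```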
For example, when the user claps near the robot, this is, for the behavior control module 110, an external stimulus, or can be called an interaction with the user. At this moment, the emotional reaction of the robot starts from the emotion point in the virtual space; through the computation of the emotion simulation unit 113, the emotion point forms a motion trajectory as time passes. During the emotion computation, the emotion point selects the current emotional expression of the robot according to the minimum distance between itself and the situation event points. Afterwards, under the pull of the convergence point P and the restoring force F, the emotion recovers.
Of course, during this process the robot may also be stimulated again and express a specific emotional reaction. For example, when the robot is stimulated by a clap for the first time, it performs a happy behavior, and when, during the emotion computation, it is stimulated by a clap a second time, it may perform a helpless or doubtful behavior.
In the present embodiment, even if the robot receives no stimulus, the emotion simulation unit 113 can still perform the emotion computation, so that the robot shows various emotional reactions and is not too rigid.
In addition, in the present embodiment, situation event points can be added to the virtual space by adding behavior data to the behavior database 140, so as to achieve differences in robot personality. Moreover, each behavior data stored in the behavior database 140 is independent, and an index search unit 140c marks each situation index relation at its coordinate position in the virtual space. Through the operation of the behavior database 140, the behavior data can be classified according to the various behaviors and emotional reactions, and the corresponding behavior data can be switched according to the demands of the temporal control module 111.
According to the content disclosed in the foregoing embodiments, a robot control method is provided below. Fig. 4 is a flowchart of a robot control method according to an embodiment of the invention. In the present embodiment, the control method is applicable to a robot with emotional reactions. The control method comprises the following steps.
First, in step S401, a first behavior pattern is provided by the behavior control module 110. Here, the behavior control module 110 can receive an initial sensing information after power-on and provide the preset first behavior pattern. In the present embodiment, the initial sensing information may be a power signal, but the invention is not limited thereto.
Then, in step S403, when the sensing unit 120 is enabled, a trigger sensing information is generated and sent to the behavior control module 110. Here, the emotion engine system 100 can obtain environment information through the sensing unit 120 to identify the type of environment interaction behavior. That is to say, when the robot receives an external stimulus, the sensing unit 120 can be activated.
Afterwards, in step S405, the time unit 150 generates a time information to the behavior control module 110, which is used to switch the temporal module of the temporal control module 111 or to provide other information required for determining the second behavior pattern. Here, the information required for determining the second behavior pattern may be the length of time between receiving the initial sensing information and receiving the trigger sensing information. In other embodiments, step S405 may also be executed before step S403 to provide the time information first.
Then, in step S407, the behavior data corresponding to the second behavior pattern is determined according to the time information, the trigger sensing information and the first behavior pattern. Here, the behavior data corresponding to the second behavior pattern may be determined through the emotion computation performed by the emotion simulation unit 113. That is to say, the emotion computation of the emotion simulation unit 113 determines, from the emotion point, the current emotional expression of the robot.
Finally, in step S409, the element driver unit 130 is driven to execute the second behavior pattern. Afterwards, the emotion point S moves toward the convergence point P under the pull of the convergence point P and the restoring force F.
In another embodiment, a random signal may further be generated, and the behavior data corresponding to the second behavior pattern is determined according to the time information, the trigger sensing information, the first behavior pattern and the random signal.
Fig. 5 is a detailed flowchart of step S407 in Fig. 4. Referring to Fig. 5, in step S501, the behavior control module 110 receives the time information and the trigger sensing information. Then, in step S503, through the emotion computation of the emotion simulation unit 113, the corresponding emotion vector V and restoring force F are obtained according to the type or strength of the first behavior pattern, the time information and the trigger sensing information. Afterwards, in step S505, the emotion point S located at the old coordinates is further displaced according to the random signal provided by the emotion simulation unit 113, and the new coordinates of the emotion point S are obtained, as shown in step S507. Then, in step S509, the emotion simulation unit 113 finds the situation event point with the minimum distance to the new coordinates of the emotion point S and provides it to the temporal control module 111. In step S511, the temporal control module 111 provides the current emotional expression of the robot according to the current temporal module and the situation event point.
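Steps S501–S511 can be combined into one sketch: displace the old emotion point by the emotion vector, the restoring pull and a random offset, then select the nearest situation event point. All numeric parameters and point coordinates are assumptions made for illustration:

```python
import math
import random

def emotion_step(old_point, vector, convergence, situations, k=0.2, noise=0.0):
    """One pass of step S407: returns the new emotion point and the chosen mode."""
    # S503/S505/S507: emotion vector V + restoring force F + random offset
    new_point = tuple(
        p + v + k * (c - p) + random.uniform(-noise, noise)
        for p, v, c in zip(old_point, vector, convergence)
    )
    # S509: situation event point with minimum distance to the new coordinates
    mode = min(situations, key=lambda m: math.dist(new_point, situations[m]))
    return new_point, mode

situations = {"happy": (1.0, 0.0, 0.0), "sad": (-1.0, 0.0, 0.0)}
new_point, mode = emotion_step((0.0, 0.0, 0.0), (1.0, 0.0, 0.0),
                               (0.0, 0.0, 0.0), situations)
print(mode)  # happy
```

With `noise=0.0` the step is deterministic, which makes the example easy to check; a real emotion simulation unit would use a nonzero noise radius as described for the random vibration interval.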
In addition, it is worth mentioning that the first behavior pattern is not limited to the preset behavior data after power-on; the behavior data corresponding to the first behavior pattern can also be determined according to the time information or the initial sensing information.
In summary, the emotion engine and the emotion engine system provided by the embodiments of the invention can finely describe and express the personality traits of an electronic device, have a modular architecture design, and are not limited to the appearance and design of a specific electronic device. In addition, the electronic device control method provided by the embodiments of the invention uses instant emotion computation and the concept of the convergence point, and can finely describe and express the personality traits of the robot.
Finally, it should be noted that the above embodiments are merely intended to illustrate rather than limit the technical solutions of the invention. Although the invention has been described in detail with reference to the preferred embodiments, those of ordinary skill in the art should understand that the technical solutions of the invention may still be modified or equivalently replaced, and such modifications or equivalent replacements do not cause the modified technical solutions to depart from the spirit and scope of the technical solutions of the invention.

Claims (25)

1. An emotion engine, adapted to an electronic device, characterized in that the emotion engine comprises:
a behavior control module, comprising an emotion simulation unit, for receiving a time information and a trigger sensing information and providing a first behavior pattern and a second behavior pattern, wherein the behavior control module determines the behavior data corresponding to the second behavior pattern according to the time information, the trigger sensing information and the first behavior pattern,
wherein the first behavior pattern and the second behavior pattern respectively correspond to at least one behavior data among a plurality of behavior data.
2. The emotion engine according to claim 1, characterized in that the emotion simulation unit further generates a random signal, and the behavior control module determines the behavior data corresponding to the second behavior pattern according to the time information, the trigger sensing information, the first behavior pattern and the random signal.
3. The emotion engine according to claim 1, characterized in that the behavior control module further determines the behavior data corresponding to the first behavior pattern according to an initial sensing information.
4. The emotion engine according to claim 3, characterized in that the time information comprises a length of time from receiving the initial sensing information to receiving the trigger sensing information, or a system age information of the electronic device.
5. The emotion engine according to claim 3, characterized in that the initial sensing information is a power signal, and after receiving the initial sensing information, the behavior control module provides the first behavior pattern and drives an element driver unit to execute the first behavior pattern.
6. The emotion engine according to claim 1, characterized in that after determining the behavior data corresponding to the second behavior pattern, the behavior control module further drives an element driver unit to execute the second behavior pattern.
7. The emotion engine according to claim 1, characterized in that the emotion simulation unit generates an emotion point and a plurality of situation event points of a virtual space, wherein each of the situation event points has a corresponding coordinate and a corresponding behavior pattern, the emotion point moves from an old coordinate to a new coordinate according to the time information, the trigger sensing information and the first behavior pattern, and the emotion simulation unit obtains the corresponding situation event point according to the new coordinate to determine the behavior data corresponding to the second behavior pattern.
8. An emotion engine system, adapted to an electronic device, characterized in that the emotion engine system comprises:
a behavior control module, comprising an emotion simulation unit, for providing a first behavior pattern and a second behavior pattern;
a sensing unit, connected to the behavior control module, wherein when the sensing unit is enabled, the sensing unit generates a trigger sensing information or an initial sensing information to the behavior control module;
a time unit, connected to the behavior control module, for generating a time information to the behavior control module; and
a behavior database, connected to the behavior control module, for storing a plurality of behavior data,
wherein the first behavior pattern and the second behavior pattern respectively correspond to at least one behavior data among the plurality of behavior data, and the behavior control module determines the behavior data corresponding to the second behavior pattern according to the time information, the trigger sensing information and the first behavior pattern.
9. The emotion engine system according to claim 8, characterized in that the emotion simulation unit further generates a random signal, and the behavior control module determines the behavior data corresponding to the second behavior pattern according to the time information, the trigger sensing information, the first behavior pattern and the random signal.
10. The emotion engine system according to claim 8, characterized in that the behavior control module determines the behavior data corresponding to the first behavior pattern according to at least one of the time information and the initial sensing information.
11. The emotion engine system according to claim 8, characterized in that it further comprises an element driver unit, wherein the initial sensing information is a power signal, and after receiving the initial sensing information, the behavior control module provides the first behavior pattern and drives the element driver unit to execute the first behavior pattern.
12. The emotion engine system according to claim 11, characterized in that after the behavior control module determines the behavior data corresponding to the second behavior pattern, it drives the element driver unit to execute the second behavior pattern.
13. A control method of an electronic device, adapted to an electronic device with an emotion engine, characterized in that the control method comprises:
providing a first behavior pattern;
generating a trigger sensing information when a sensing unit of the electronic device is enabled;
generating a time information; and
determining the behavior data corresponding to a second behavior pattern according to the time information, the trigger sensing information and the first behavior pattern.
14. The control method according to claim 13, characterized in that it further comprises generating a random signal, and the behavior data corresponding to the second behavior pattern is determined according to the time information, the trigger sensing information, the first behavior pattern and the random signal.
15. The control method according to claim 13, characterized in that it further comprises determining the behavior data corresponding to the first behavior pattern according to an initial sensing information.
16. The control method according to claim 15, characterized in that the time information comprises a length of time from receiving the initial sensing information to receiving the trigger sensing information, or a system age information of the electronic device.
17. The control method according to claim 16, characterized in that it further comprises analyzing the type of the trigger sensing information.
18. The control method according to claim 13, characterized in that providing the first behavior pattern comprises:
providing the first behavior pattern after receiving a power signal; and
executing the first behavior pattern.
19. The control method according to claim 13, characterized in that after determining the behavior data corresponding to the second behavior pattern, an element driver unit of the electronic device is driven to execute the second behavior pattern.
20. The control method according to claim 13, characterized in that the step of determining the behavior data corresponding to the second behavior pattern comprises:
generating an emotion point and a plurality of situation event points in a virtual space,
wherein the emotion point moves according to the time information, the trigger sensing information and the first behavior pattern, and the behavior data corresponding to the second behavior pattern corresponds to the situation event point closest to the emotion point.
21. The control method according to claim 20, characterized in that it further comprises:
obtaining a corresponding emotion vector according to the first behavior mode, the time information, and the triggering sensing information;
correcting the coordinates of the emotion point in the virtual space according to the emotion vector;
obtaining the situation event point at the shortest distance from the corrected emotion point; and
determining the behavior data corresponding to the second behavior mode according to the obtained situation event point.
22. The control method according to claim 21, characterized in that it further comprises providing a convergence point, and, in the step of correcting the coordinates of the emotion point in the virtual space according to the emotion vector, providing a restoring force so that the emotion point moves toward the convergence point.
23. The control method according to claim 22, characterized in that the coordinates corresponding to the convergence point vary with the time information.
24. The control method according to claim 20, characterized in that the virtual space is a multi-dimensional space having a plurality of coordinate axes, the control method obtains the situation event point corresponding to a new coordinate by a vector operation, and the coordinate axes of the virtual space respectively represent different mental-state behaviors of the electronic device.
25. The control method according to claim 21, characterized in that the step of determining the behavior data corresponding to the second behavior mode according to the obtained situation event point comprises providing the behavior data corresponding to the second behavior mode according to system age information and the obtained situation event point.
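The control flow of claims 13 and 14 (first behavior mode plus triggering sensing information and time information selecting a second behavior mode, optionally perturbed by a random signal) can be sketched as follows. This is a minimal illustrative model, not the patented implementation; the mode names, the behavior table, the one-hour damping threshold, and the 10% randomization rate are all invented assumptions.

```python
# Toy behavior database: (first behavior mode, trigger type) -> behavior data.
# All keys and values here are hypothetical examples, not from the patent.
behavior_database = {
    ("happy", "petted"): "wag",
    ("happy", "shaken"): "startle",
    ("calm", "petted"): "purr",
    ("calm", "shaken"): "bark",
}

def decide_second_mode(first_mode, trigger_info, elapsed_s, rng=None):
    """Pick behavior data for the second behavior mode from the first
    behavior mode, the triggering sensing information, and time information."""
    base = behavior_database.get((first_mode, trigger_info), "idle")
    # Time information (claim 16): in this toy model, a long gap since the
    # initial sensing information damps the reaction to an idle behavior.
    if elapsed_s > 3600.0:
        base = "idle"
    # Claim 14: an optional random signal can perturb the decision.
    if rng is not None and rng.random() < 0.1:
        base = rng.choice(sorted(behavior_database.values()))
    return base
```

Passing a seeded `random.Random` instance as `rng` exercises the claim-14 variant while keeping the default call deterministic.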
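Claim 20's selection rule, an emotion point in a virtual space choosing the behavior data of the nearest situation event point, amounts to a nearest-neighbor lookup. The sketch below assumes a 2-D space with invented coordinates and behavior labels; the patent's virtual space may have more axes (claim 24) and different semantics per axis.

```python
import math

# Hypothetical situation event points in a 2-D virtual space:
# coordinates -> behavior data. Positions and labels are illustrative only.
situation_points = {
    (1.0, 1.0): "cheerful_behavior",
    (-1.0, 1.0): "nervous_behavior",
    (-1.0, -1.0): "sad_behavior",
    (1.0, -1.0): "relaxed_behavior",
}

def nearest_behavior(emotion_point):
    """Return the behavior data of the situation event point closest to
    the current emotion point (Euclidean distance, per a vector operation
    as in claim 24)."""
    return min(
        situation_points.items(),
        key=lambda kv: math.dist(emotion_point, kv[0]),
    )[1]
```

The same lookup generalizes to any number of coordinate axes, since `math.dist` accepts points of arbitrary equal dimension.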
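Claims 21 through 23 describe correcting the emotion point's coordinates by an emotion vector while a restoring force pulls the point toward a convergence point (whose own coordinates may vary with the time information). One plausible reading is a linear spring-like update; the gain constant `k` below is an assumed parameter, not a value from the patent.

```python
def update_emotion_point(point, emotion_vector, convergence, k=0.2):
    """Correct the emotion point by the emotion vector, then apply a
    restoring force of strength k (0 <= k <= 1) that moves the result
    a fraction of the way toward the convergence point."""
    # Step 1 (claim 21): displace the emotion point by the emotion vector.
    moved = tuple(p + v for p, v in zip(point, emotion_vector))
    # Step 2 (claim 22): pull the displaced point toward the convergence point.
    return tuple(m + k * (c - m) for m, c in zip(moved, convergence))
```

With `k=0` the update reduces to pure vector displacement; larger `k` makes the emotion point settle faster toward the convergence point, and supplying a time-dependent `convergence` argument models claim 23.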
CN200910258052.8A 2009-12-16 2009-12-16 Emotion engine, emotion engine system and control method of electronic device Active CN102103707B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN200910258052.8A CN102103707B (en) 2009-12-16 2009-12-16 Emotion engine, emotion engine system and control method of electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN200910258052.8A CN102103707B (en) 2009-12-16 2009-12-16 Emotion engine, emotion engine system and control method of electronic device

Publications (2)

Publication Number Publication Date
CN102103707A true CN102103707A (en) 2011-06-22
CN102103707B CN102103707B (en) 2014-06-11

Family

ID=44156456

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200910258052.8A Active CN102103707B (en) 2009-12-16 2009-12-16 Emotion engine, emotion engine system and control method of electronic device

Country Status (1)

Country Link
CN (1) CN102103707B (en)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104516873A (en) * 2014-12-12 2015-04-15 北京智谷睿拓技术服务有限公司 Method and device for building emotion model
CN104881108A (en) * 2014-02-27 2015-09-02 青岛海尔机器人有限公司 Intelligent man-machine interaction method and device
CN105389735A (en) * 2015-11-18 2016-03-09 重庆理工大学 Multi-motive emotion generation method based on SPFA algorithm
CN106113062A (en) * 2016-08-23 2016-11-16 陈中流 One is accompanied and attended to robot
CN106462255A (en) * 2016-06-29 2017-02-22 深圳狗尾草智能科技有限公司 A method, system and robot for generating interactive content of robot
CN106471444A (en) * 2016-07-07 2017-03-01 深圳狗尾草智能科技有限公司 A kind of exchange method of virtual 3D robot, system and robot
CN106537425A (en) * 2016-06-29 2017-03-22 深圳狗尾草智能科技有限公司 Method and system for generating robot interaction content, and robot
CN106537293A (en) * 2016-06-29 2017-03-22 深圳狗尾草智能科技有限公司 Method and system for generating robot interactive content, and robot
CN106662932A (en) * 2016-07-07 2017-05-10 深圳狗尾草智能科技有限公司 Method, system and robot for recognizing and controlling household appliances based on intention
CN106662931A (en) * 2016-07-07 2017-05-10 深圳狗尾草智能科技有限公司 Robot man-machine interactive system, device and method
WO2018000266A1 (en) * 2016-06-29 2018-01-04 深圳狗尾草智能科技有限公司 Method and system for generating robot interaction content, and robot
WO2018006373A1 (en) * 2016-07-07 2018-01-11 深圳狗尾草智能科技有限公司 Method and system for controlling household appliance on basis of intent recognition, and robot
CN110871813A (en) * 2018-08-31 2020-03-10 比亚迪股份有限公司 Control method and device of virtual robot, vehicle, equipment and storage medium
TWI698757B (en) * 2018-05-29 2020-07-11 英屬開曼群島商意騰科技股份有限公司 Smart engine with dynamic profiles and method of operating smart engine
TWI833646B (en) * 2023-05-12 2024-02-21 緯創資通股份有限公司 Mobility aids, mobility aids auxiliary systems and operating method thereof

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI577194B (en) * 2015-10-22 2017-04-01 山衛科技股份有限公司 Environmental voice source recognition system and environmental voice source recognizing method thereof

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1149156A (en) * 1995-06-20 1997-05-07 松下电器产业株式会社 Data retrieval apparatus, evaluation entry apparatus and sensibility entry apparatus
CN1313781A (en) * 1999-04-30 2001-09-19 索尼公司 Electronic pet system, network system, robot and storage medium
JP2003089077A (en) * 2001-09-12 2003-03-25 Toshiba Corp Robot
US20030080989A1 (en) * 1998-01-23 2003-05-01 Koichi Matsuda Information processing apparatus, method and medium using a virtual reality space
CN1553845A (en) * 2001-11-07 2004-12-08 索尼公司 Robot system and robot apparatus control method

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104881108A (en) * 2014-02-27 2015-09-02 青岛海尔机器人有限公司 Intelligent man-machine interaction method and device
CN104881108B (en) * 2014-02-27 2018-08-31 青岛海尔机器人有限公司 A kind of intelligent human-machine interaction method and device
CN104516873A (en) * 2014-12-12 2015-04-15 北京智谷睿拓技术服务有限公司 Method and device for building emotion model
CN105389735A (en) * 2015-11-18 2016-03-09 重庆理工大学 Multi-motive emotion generation method based on SPFA algorithm
WO2018000261A1 (en) * 2016-06-29 2018-01-04 深圳狗尾草智能科技有限公司 Method and system for generating robot interaction content, and robot
CN106462255A (en) * 2016-06-29 2017-02-22 深圳狗尾草智能科技有限公司 A method, system and robot for generating interactive content of robot
CN106537425A (en) * 2016-06-29 2017-03-22 深圳狗尾草智能科技有限公司 Method and system for generating robot interaction content, and robot
CN106537293A (en) * 2016-06-29 2017-03-22 深圳狗尾草智能科技有限公司 Method and system for generating robot interactive content, and robot
WO2018000260A1 (en) * 2016-06-29 2018-01-04 深圳狗尾草智能科技有限公司 Method for generating robot interaction content, system, and robot
WO2018000267A1 (en) * 2016-06-29 2018-01-04 深圳狗尾草智能科技有限公司 Method for generating robot interaction content, system, and robot
WO2018000266A1 (en) * 2016-06-29 2018-01-04 深圳狗尾草智能科技有限公司 Method and system for generating robot interaction content, and robot
CN106471444A (en) * 2016-07-07 2017-03-01 深圳狗尾草智能科技有限公司 A kind of exchange method of virtual 3D robot, system and robot
CN106662931A (en) * 2016-07-07 2017-05-10 深圳狗尾草智能科技有限公司 Robot man-machine interactive system, device and method
CN106662932A (en) * 2016-07-07 2017-05-10 深圳狗尾草智能科技有限公司 Method, system and robot for recognizing and controlling household appliances based on intention
WO2018006373A1 (en) * 2016-07-07 2018-01-11 深圳狗尾草智能科技有限公司 Method and system for controlling household appliance on basis of intent recognition, and robot
WO2018006380A1 (en) * 2016-07-07 2018-01-11 深圳狗尾草智能科技有限公司 Human-machine interaction system, device, and method for robot
WO2018006372A1 (en) * 2016-07-07 2018-01-11 深圳狗尾草智能科技有限公司 Method and system for controlling household appliance on basis of intent recognition, and robot
CN106113062A (en) * 2016-08-23 2016-11-16 陈中流 One is accompanied and attended to robot
TWI698757B (en) * 2018-05-29 2020-07-11 英屬開曼群島商意騰科技股份有限公司 Smart engine with dynamic profiles and method of operating smart engine
US11501135B2 (en) 2018-05-29 2022-11-15 British Cayman Islands Intelligo Technology Inc. Smart engine with dynamic profiles
CN110871813A (en) * 2018-08-31 2020-03-10 比亚迪股份有限公司 Control method and device of virtual robot, vehicle, equipment and storage medium
TWI833646B (en) * 2023-05-12 2024-02-21 緯創資通股份有限公司 Mobility aids, mobility aids auxiliary systems and operating method thereof

Also Published As

Publication number Publication date
CN102103707B (en) 2014-06-11

Similar Documents

Publication Publication Date Title
CN102103707B (en) Emotion engine, emotion engine system and control method of electronic device
US7219064B2 (en) Legged robot, legged robot behavior control method, and storage medium
EP1508409A1 (en) Robot device and robot control method
CN110139732B (en) Social robot with environmental control features
TWI413938B (en) Emotion engine, emotion engine system and electronic device control method
US8209179B2 (en) Speech communication system and method, and robot apparatus
US7987091B2 (en) Dialog control device and method, and robot device
US8145492B2 (en) Robot behavior control system and method, and robot apparatus
CN101795831B (en) Robot control system, robot, program, and information recording medium
US20180133900A1 (en) Embodied dialog and embodied speech authoring tools for use with an expressive social robot
KR100843822B1 (en) Robot device, method for controlling motion of robot device, and system for controlling motion of robot device
US10576618B2 (en) Robot having communication with human, robot control method, and non-transitory recording medium
US20170024095A1 (en) Multi-media platform
KR20030074473A (en) Method and apparatus for speech synthesis, program, recording medium, method and apparatus for generating constraint information and robot apparatus
KR20180129886A (en) Persistent companion device configuration and deployment platform
JP2005193331A (en) Robot device and its emotional expression method
US20240190012A1 (en) Robot and control method of robot
CN110196632A (en) Information processing unit, information processing method and program
US20030056252A1 (en) Robot apparatus, information display system, and information display method
US20240027977A1 (en) Method and system for processing input values
JP2006110707A (en) Robot device
JP4556425B2 (en) Content reproduction system, content reproduction method, and content reproduction apparatus
JP4552465B2 (en) Information processing apparatus, action control method for robot apparatus, robot apparatus, and computer program
JP2001157980A (en) Robot device, and control method thereof
JP2001157981A (en) Robot device and control method thereof

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant