US20190000041A1 - Mobile Object Avoiding Mobile Platform
- Publication number
- US20190000041A1 (application Ser. No. 16/041,855)
- Authority
- US
- United States
- Prior art keywords
- mobile device
- mobile
- spatial
- employing
- movement
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01K—ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
- A01K15/00—Devices for taming animals, e.g. nose-rings or hobbles; Devices for overturning animals in general; Training or exercising equipment; Covering boxes
- A01K15/02—Training or exercising equipment, e.g. mazes or labyrinths for animals ; Electric shock devices ; Toys specially adapted for animals
- A01K15/027—Exercising equipment, e.g. tread mills, carousels
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01K—ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
- A01K15/00—Devices for taming animals, e.g. nose-rings or hobbles; Devices for overturning animals in general; Training or exercising equipment; Covering boxes
- A01K15/02—Training or exercising equipment, e.g. mazes or labyrinths for animals ; Electric shock devices ; Toys specially adapted for animals
- A01K15/021—Electronic training devices specially adapted for dogs or cats
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
- G01C21/206—Instruments for performing navigational calculations specially adapted for indoor navigation
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0011—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
- G05D1/0016—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the operator's input device
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0276—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
- G05D1/028—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using a RF signal
- G05D1/0282—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using a RF signal generated in a local control room
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/72427—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for supporting games or graphical animations
-
- H04M1/72544—
-
- G05D2201/0214—
Definitions
- FIG. 1 is a system diagram of an animal exerciser system as per an aspect of an embodiment of the present invention.
- FIG. 2 is a system diagram of an animal exerciser system as per an aspect of an embodiment of the present invention.
- FIG. 3 is a system diagram of an animal exerciser system as per an aspect of an embodiment of the present invention.
- FIG. 4A is a system diagram of a mobile device as per an aspect of an embodiment of the present invention.
- FIG. 4B is a block diagram of a mobile device platform as per an aspect of an embodiment of the present invention.
- FIG. 5 is a system diagram of an obstacle system as per an aspect of an embodiment of the present invention.
- FIG. 6A is a system diagram of a screen display as per an aspect of an embodiment of the present invention.
- FIG. 6B is a system diagram of a screen display as per an aspect of an embodiment of the present invention.
- FIG. 7 is a flow diagram of an avoidance process as per an aspect of an embodiment of the present invention.
- FIG. 8 is a block diagram of a system interaction as per an aspect of an embodiment of the present invention.
- FIG. 9 is a flow diagram of an avoidance process as per an aspect of an embodiment of the present invention.
- FIG. 10 is a flow diagram of a calibration process as per an aspect of an embodiment of the present invention.
- FIG. 11 is a flow diagram of a calibration process as per an aspect of an embodiment of the present invention.
- FIG. 12 is a flow diagram of an obstacle detection process as per an aspect of an embodiment of the present invention.
- FIG. 13 is a flow diagram of a mobile object detection process as per an aspect of an embodiment of the present invention.
- FIG. 14 is a flow diagram of an escape process as per an aspect of an embodiment of the present invention.
- FIG. 15 is a block diagram of a computer system as per an aspect of an embodiment of the present invention.
- Embodiments may be employed to exercise and/or entertain an animal such as a cat and/or a dog.
- Embodiments comprise a system that may comprise a mobile device, a spatial profiling device, and a mobile device manager.
- The mobile device may comprise a platform, a first communications circuit that receives navigation instructions, and a motion drive configured to propel the platform according to the navigation instructions.
- The spatial profiling device may capture a spatial profile of a motion area.
- The mobile device manager may comprise at least one second communications circuit that communicates with the first communications circuit and the spatial profiling device, processor(s), and memory storing processing instructions.
- The instructions, when executed, may cause the mobile device manager to receive spatial profiles, locate a mobile object, locate the mobile device, plan a movement for the mobile device that avoids the mobile object, and communicate the movement to the mobile device as navigation instructions.
- Embodiments may comprise a process that may comprise receiving a spatial profile from a spatial profiling device, locating a mobile object employing the spatial profile, locating a mobile device, planning a movement for the mobile device between a first location and a second location, and communicating the movement to the mobile device as navigation instructions employing a communications circuit.
- The movement may be configured to avoid the mobile object.
- Embodiments may comprise a non-transitory tangible computer readable medium containing instructions configured to cause one or more processors to execute a process comprising: receiving spatial profiles from a spatial profiling device, locating a mobile object employing the spatial profiles, locating a mobile device, planning a movement for the mobile device between a first location and a second location, and communicating the movement to the mobile device as navigation instructions employing a communications circuit.
- The movement may be configured to avoid the mobile object.
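- The summarized process maps naturally onto a periodic control loop. Below is a minimal Python sketch of one such loop; the `profiler` and `device_link` objects and the planner/locator callables are hypothetical stand-ins, not names from this disclosure:

```python
import time

def avoidance_loop(profiler, device_link, plan_path, locate_object, locate_device):
    """One pass-per-tick sketch of the avoidance process described above.

    `profiler`, `device_link`, and the three callables are hypothetical
    stand-ins for the spatial profiling device, the communications
    circuit, and the manager's processing instructions.
    """
    while True:
        profile = profiler.capture()              # receive a spatial profile
        obj_xy = locate_object(profile)           # locate the mobile object
        dev_xy = locate_device(profile)           # locate the mobile device
        # Plan a movement from the device's current (first) location to a
        # second location chosen so the path avoids the mobile object.
        waypoints = plan_path(start=dev_xy, avoid=obj_xy)
        device_link.send(waypoints)               # navigation instructions
        time.sleep(0.1)                           # ~10 Hz control loop
```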
- Referring to FIG. 1, an animal exerciser system 100 is shown.
- The system 100 may comprise a mobile device 140, a spatial profiling device 120, and a mobile device manager 110.
- At least the mobile device 140 and a mobile object 150 may be disposed within a motion area 130.
- The mobile device 140 may further comprise a platform 142, a motion drive 144, and a communications circuit 146.
- The motion drive 144 and communications circuit 146 may be disposed on the platform 142.
- Motion drive 144 may be configured to receive navigation instructions and to propel platform 142 according to the navigation instructions. To accomplish this movement, motion drive 144 may comprise a battery to supply electricity, DC or AC motors, and control mechanisms such as, for example, an H bridge. Motion drive 144 may propel the platform using mechanisms such as wheels, flopping wheels, tracks, plungers, legs, magnets, compressed air, a combination thereof, and/or the like.
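- As a concrete illustration of turning a navigation command into H-bridge motor commands, the sketch below maps a forward/turn request onto left and right wheel duty cycles (differential drive). The `set_pwm` writer and the [-1, 1] command convention are assumptions:

```python
def differential_drive(linear, angular):
    """Map a navigation command to left/right wheel duty cycles.

    `linear` and `angular` are in [-1, 1]; positive angular turns left.
    Returns (left, right) duty cycles in [-1, 1], where the sign selects
    the H-bridge direction input and the magnitude the PWM duty cycle.
    """
    left = linear - angular
    right = linear + angular
    # Normalize so neither wheel command exceeds full scale.
    scale = max(1.0, abs(left), abs(right))
    return left / scale, right / scale

def apply_to_h_bridge(left, right, set_pwm):
    """`set_pwm(channel, direction, duty)` is a hypothetical driver hook;
    replace it with the GPIO/PWM library for your platform."""
    set_pwm("left", "fwd" if left >= 0 else "rev", abs(left))
    set_pwm("right", "fwd" if right >= 0 else "rev", abs(right))
```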
- The at least one communications circuit 146 may be configured to communicate with communications circuit 112 disposed on the mobile device manager 110 via navigation instructions 148. This communication may occur via a wired interface, a wireless interface, a combination thereof, and/or the like. The communication may utilize wireless communication protocols such as Bluetooth®, LTE, Wi-Fi, radio waves, a combination thereof, and/or the like.
- The communications circuit 146 may send and/or receive navigation instructions 148. According to the navigation instructions 148, the motion drive 144 may be configured to propel the platform 142 to move mobile device 140.
- The spatial profiling device 120 may comprise one or more sensors configured to collect spatial information in and/or around the motion area 130. Examples of sensors comprise cameras, proximity sensors, motion sensors, a combination thereof, and/or the like.
- The spatial profiling device 120 may comprise a webcam, an infrared projector, a 3D scanner system such as, for example, a Microsoft Kinect™, a combination thereof, and/or the like.
- The spatial profiling device 120 may operate using pairs of emitters and detectors to detect objects.
- The spatial profiling device may capture spatial profiles 125 of the motion area 130. Spatial profiles 125 may comprise images captured by sensors, such as a camera, or composites of the motion area 130 created using proximity sensors, motion sensors, a combination thereof, and/or the like.
- The spatial profiles 125 may be updatable and represent a depiction of the motion area 130.
- The mobile device manager 110 may comprise at least one communications circuit 112, one or more processors 114, and memory 116.
- The at least one communications circuit 112 may be configured to receive spatial profiles 125 from the spatial profiling device 120.
- The at least one communications circuit 112 may be configured to communicate with communications circuit 146 disposed on the mobile device 140 via navigation instructions 148.
- The communications circuit 146 may direct the movement of mobile device 140 employing the navigation instructions 148.
- Processors 114 may comprise a microprocessor produced by microprocessor manufacturers such as Advanced Micro Devices, Inc. (AMD) of Sunnyvale, Calif., Atmel Corporation of San Jose, Calif., Intel Corporation of Santa Clara, Calif., or Texas Instruments Inc. of Dallas, Tex.
- Processors 114 may comprise and/or be implemented as other logic-based controllers such as FPGAs or PLCs.
- Memory 116 may comprise nonvolatile memory configured to store processing instructions. Examples of memory 116 comprise ROM, EEPROM, Flash, a combination thereof, and/or the like. Memory 116 may comprise volatile memory such as, for example, RAM.
- Contained within memory 116 may be instructions 117 that, when executed, may cause the mobile device manager 110 to receive spatial profiles 125 from the spatial profiling device and locate mobile object 150 employing the spatial profiles 125.
- Mobile device manager 110 may locate mobile device 140.
- Mobile device manager 110 may plan a movement for mobile device 140 between a first location and a second location in motion area 130. The movement may be configured to avoid mobile object 150.
- The movement may be communicated to mobile device 140 as navigation instructions 148 by employing communications circuit 112.
- The processing instructions 117 may cause the mobile device manager 110 to plan the movement employing an expected movement of mobile object 150.
- The mobile device 140 may be located using infrared light.
- The mobile device 140 may be located employing a wheel encoder.
- Mobile device manager 110 may distinguish colors in received spatial profiles 125.
- Mobile device manager 110 may detect a distance of at least one of the mobile device 140 and the mobile object 150 from a known location in the received spatial profiles 125.
- The mobile object 150 may be located employing motion detection and background detection techniques.
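- A minimal sketch of locating a mobile object with motion/background detection, using OpenCV's MOG2 background subtractor as one concrete choice (the disclosure does not prescribe a particular algorithm or library):

```python
import cv2

def locate_mobile_object(capture, min_area=500):
    """Return the centroid of the largest moving region, or None.

    `capture` is any cv2.VideoCapture-like source. The background model
    accumulates over the first frames, after which moving foreground
    blobs larger than `min_area` pixels are reported.
    """
    subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)
    while True:
        ok, frame = capture.read()
        if not ok:
            return None
        mask = subtractor.apply(frame)                  # foreground mask
        mask = cv2.medianBlur(mask, 5)                  # suppress speckle
        contours, _ = cv2.findContours(
            mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        big = [c for c in contours if cv2.contourArea(c) > min_area]
        if big:
            m = cv2.moments(max(big, key=cv2.contourArea))
            return (m["m10"] / m["m00"], m["m01"] / m["m00"])

# Example: locate_mobile_object(cv2.VideoCapture(0))
```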
- Mobile device manager 110 may comprise a touch screen display 118.
- The mobile device manager 110 may comprise a device containing a touch screen display such as, for example, a mobile phone, a tablet, a desktop, a laptop computer, a combination thereof, and/or the like.
- In embodiments utilizing touch screen display 118, memory 116 may contain instructions that, when executed, cause the mobile device manager 110 to show at least one of the spatial profiles on touch screen display 118.
- The mobile device manager 110 may also determine a second location via a selection on touch screen display 118.
- The mobile object 150 may be an animal such as a cat, a dog, a human, and/or the like.
- The mobile object 150 may comprise a second mobile device.
- The second mobile device may be similar to mobile device 140.
- Essentially, mobile object 150 may be an item that has the capacity to move.
- The motion area 130 may comprise a region of space in which the mobile device 140 and/or the mobile object 150 may operate.
- The mobile device 140 and/or the mobile object 150 may move in motion area 130.
- Spatial profiling device 120 may be configured to capture portions of motion area 130 and/or the entirety of motion area 130.
- Motion area 130 may be a space within a home dwelling, a room, and/or the like.
- Referring to FIG. 2, system 200 may comprise a mobile device 240 and an obstacle 260.
- A spatial profiling device 220, which may be configured to capture a two- or three-dimensional spatial profile, may be positioned relative to and/or on the obstacle 260.
- System 200 may operate within motion area 230 and interact with mobile object 250.
- Mobile object 250 may comprise, as illustrated in this example, a cat.
- Mobile device 240 may move to avoid mobile object 250 by positioning itself such that obstacle 260 is between mobile device 240 and mobile object 250.
- Obstacle 260, which may comprise processing features, may utilize a spatial profile of motion area 230.
- The spatial profile may be created employing spatial profiling device 220.
- Obstacle 260 may comprise a preexisting piece of furniture, either in a dwelling or outdoors.
- The processing may be performed in an attachment to the preexisting obstacle 260.
- Spatial profiling device 220 may comprise a camera, and the spatial profiles captured may comprise images.
- The spatial profiling device 220 may utilize a wide-angle lens.
- Spatial profiling device 220 may be positioned vertically above obstacle 260 so that spatial profiling device 220 may identify obstacle 260 within motion area 230.
- Spatial profiling device 220 may be affixed on an arm that is attached to obstacle 260 at a height that allows spatial profiling device 220 to obtain a spatial profile of motion area 230.
- Spatial profiling device 220 may identify mobile device 240 and mobile object 250 using spatial profiling assist equipment.
- Mobile device 240 may emit a beacon using a light-emitting diode, in the visible and/or infrared spectrum, that spatial profiling device 220 is configured to detect.
- The spatial profiling device 220 may utilize motion detection techniques.
- Mobile device 240 may move to a position such that obstacle 260 is positioned between mobile device 240 and mobile object 250.
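- Choosing such a position reduces to a small geometry problem: place the device on the ray from the mobile object through the obstacle center, slightly beyond the obstacle. A sketch, with coordinate conventions and the standoff distance as assumptions:

```python
import math

def hiding_point(obstacle_xy, object_xy, standoff=0.3):
    """Pick a point on the far side of the obstacle from the mobile object.

    The point lies on the ray from the mobile object through the obstacle
    center, `standoff` meters beyond the obstacle, so the obstacle sits
    between the mobile device and the mobile object. Coordinates are 2-D
    floor positions; the names and the standoff value are assumptions.
    """
    ox, oy = obstacle_xy
    cx, cy = object_xy
    dx, dy = ox - cx, oy - cy
    dist = math.hypot(dx, dy)
    if dist == 0:                       # object sits on the obstacle center
        return ox + standoff, oy        # any direction works; pick +x
    return ox + dx / dist * standoff, oy + dy / dist * standoff

# e.g. hiding_point((2.0, 1.0), (0.0, 1.0)) -> (2.3, 1.0)
```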
- Referring to FIG. 3, system 300 may comprise a mobile device 340, an obstacle 360, and a spatial profiling device 320.
- System 300 may operate within motion area 330 and interact with a mobile object 350.
- Mobile object 350 may be an animal such as the illustrated cat.
- System 300 may utilize a spatial profiling device 320 that is detached from obstacle 360.
- Spatial profiling device 320 may be attached to, for example, the ceiling of a room in order to capture motion area 330.
- Spatial profiling device 320 may comprise one or more sensors such as cameras, proximity sensors, motion sensors, a combination thereof, and/or the like.
- Motion area 330 may be a two- and/or three-dimensional space.
- Spatial profiling device 320 may also reside in alternative locations, such as, for example, on a tabletop, a counter, a shelf, other existing furniture within a room, a combination thereof, and/or the like.
- Spatial profiling device 320 may be mounted outside, such as on the exterior of a building and/or home.
- Referring to FIG. 4A, mobile device 440 may comprise a mobile device platform 442, wheels 420A and 420B coupled to wheel encoders, a stability nub 430, and a tail 460.
- Mobile device platform 442 may be a surface upon which circuitry may be disposed.
- Platform 442 may be an insulated sheet and/or a type of circuit board.
- Mobile device platform 442 may be disposed within a plastic casing, while wheels 420A and 420B may reside on the sides of the casing.
- The wheel encoders coupled to wheels 420A and 420B may provide information to the mobile device 440 circuitry and may be utilized, at least in part, to determine distance, speed, acceleration, a combination thereof, and/or the like.
- Stability nub 430 may be positioned to balance the movement of mobile device 440.
- Stability nub 430 may be a wheel.
- Stability nub 430 may be a spherically shaped piece of plastic.
- Platform 442 and wheels 420A and 420B may be placed in a manner configured to allow mobile device 440 to remain mobile even when in a flipped orientation.
- Tail 460 may comprise an attachment to attract the attention of a mobile object, such as a cat.
- Tail 460 may be a string but may also comprise various colors, noise makers such as a bell, other attention-generating items, a combination thereof, and/or the like.
- FIG. 4B is a block diagram of mobile device platform 442.
- Platform 442 may comprise a communications circuit 446 and/or a motion drive 444.
- Communications circuit 446 may comprise circuitry configured to interface with other components contained on mobile device platform 442.
- Communications circuit 446 may be configured to communicate with systems external to the mobile device. For example, communications circuit 446 may send data concerning mobile device 440 such as distance measurements, speed measurements, inertial measurements, a combination thereof, and/or the like. Further, communications circuit 446 may be configured to receive instructions from an external system that may direct the movement of mobile device 440.
- Motion drive 444 may be configured to receive navigation instructions and to move wheels 420A and 420B according to those instructions. To accomplish this movement, motion drive 444 may comprise a battery to supply electricity, DC or AC motors, and/or control mechanisms such as an H bridge. Motion drive 444 may also operate without employing wheels 420A and 420B and still propel platform 442 using mechanisms such as tracks, plungers, legs, magnets, compressed air, a combination thereof, and/or the like.
- Platform 442 may house a beacon 422.
- Beacon 422 may emit an electromagnetic signal.
- An electromagnetic signal may comprise a modulated wave or synchronized oscillations of electric and magnetic fields. Examples of electromagnetic signals comprise a signal in the ultraviolet, visible light, infrared, or radio wave spectrum, a combination thereof, and/or the like.
- The signal emitted by beacon 422 may allow an external imaging device to detect mobile device 440.
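- One simple way an imaging device might detect such a beacon is to take the centroid of the brightest pixels in an infrared frame. A NumPy sketch, with the intensity threshold as an assumed tuning value:

```python
import numpy as np

def locate_beacon(ir_frame, threshold=240):
    """Return (x, y) of the beacon in an 8-bit infrared image, or None.

    The beacon LED saturates the IR camera, so the brightest connected
    mass of pixels approximates its position. A fixed intensity
    threshold is an assumption; a real system might calibrate it.
    """
    bright = ir_frame >= threshold
    if not bright.any():
        return None
    ys, xs = np.nonzero(bright)
    return float(xs.mean()), float(ys.mean())

# Example with a synthetic frame:
frame = np.zeros((480, 640), dtype=np.uint8)
frame[100:104, 300:304] = 255          # simulated beacon blob
print(locate_beacon(frame))            # ~ (301.5, 101.5)
```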
- Platform 442 may house an inertial measurement device 424.
- The inertial measurement device 424 may comprise, for example, a device configured to measure changes in acceleration, magnitude, and/or direction. Examples comprise an accelerometer and/or gyroscope configured to measure changes in acceleration of the mobile device 440. This information may be employed to determine the orientation of mobile device 440, collisions, unlevel terrain, other types of interactions that mobile device 440 may have with the environment, a combination thereof, and/or the like.
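- A sketch of detecting a collision from such inertial data by watching for a sudden jump in acceleration magnitude; the threshold is a made-up value that would need tuning on real hardware:

```python
import math

def detect_collision(samples, jump_threshold=8.0):
    """Return the index of the first suspected collision, or None.

    `samples` is a sequence of (ax, ay, az) accelerometer readings in
    m/s^2. A collision shows up as a sudden jump in the magnitude of
    acceleration between consecutive samples.
    """
    prev = None
    for i, (ax, ay, az) in enumerate(samples):
        mag = math.sqrt(ax * ax + ay * ay + az * az)
        if prev is not None and abs(mag - prev) > jump_threshold:
            return i
        prev = mag
    return None

# Steady rolling, then an impact spike at index 3:
print(detect_collision([(0, 0, 9.8), (0.2, 0, 9.8), (0.1, 0, 9.8),
                        (15.0, 5.0, 9.8)]))   # -> 3
```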
- Platform 442 may operate without reference to an external system and house a spatial profiling device 420 along with a mobile device manager 410.
- This autonomous embodiment may utilize spatial profiling device 420 to generate spatial profile(s) of the environment in which mobile device 440 operates.
- Spatial profiling device 420 may comprise a camera mounted such that the lens captures visual information above mobile device 440.
- Spatial profiles may be created employing the images captured by the lens.
- Spatial profiling device 420 may comprise a light emitter and detector pair to generate a spatial profile.
- A light emitter such as a light-emitting diode may produce electromagnetic waves such as infrared light, ultraviolet light, visible light, a combination thereof, and/or the like.
- The detector may be, for example, a light-emitting diode, a photodiode, a phototransistor, a combination thereof, and/or the like.
- The detector may be configured to capture reflections of the emitted light and, using the reflections, create a spatial profile of the environment surrounding mobile device 440. That is, the spatial profiles may map the locations of objects as well as predict the location of obstacles.
- Spatial profiling device 420 may comprise several emitter/detector pairs.
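- A sketch of turning readings from several such emitter/detector pairs into a rough range profile. The mounting angles and the inverse-square reflectance model are assumptions, not part of this disclosure:

```python
import math

# Mounting angles (radians) of several IR emitter/detector pairs around
# the platform; both the angles and the sensor model are assumptions.
PAIR_ANGLES = [math.radians(a) for a in (-60, -30, 0, 30, 60)]

def spatial_profile(readings, k=0.5, floor=0.01):
    """Turn raw reflectance readings into (angle, estimated range) points.

    Reflected IR intensity falls off roughly with the square of distance,
    so range is estimated as sqrt(k / reading). Readings at or below
    `floor` are treated as "no obstacle in view".
    """
    profile = []
    for angle, r in zip(PAIR_ANGLES, readings):
        rng = math.sqrt(k / r) if r > floor else None   # None = clear
        profile.append((angle, rng))
    return profile

# Strong return straight ahead, nothing to the sides:
print(spatial_profile([0.005, 0.005, 0.8, 0.005, 0.005]))
```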
- Mobile device manager 410 may direct the movement of mobile device 440 to avoid obstacles as well as other mobile objects that may obstruct the movement of mobile device 440.
- Mobile device manager 410 may utilize communications circuit 446 to control motion drive 444 to execute these maneuvers.
- Mobile device manager 410 may also utilize instructions received by the communications circuit, with or without information from the spatial profiling device 420, in order to direct the movement of mobile device 440.
- Mobile device 440 may operate without the need for any external navigation instructions but may still have the capability to receive and utilize commands or instructions sent from an external system.
- Referring to FIG. 5, obstacle system 500 may comprise a base obstacle 560, a spatial profiling device 520, a mobile device manager 510, and one or more tunnels 570.
- Base obstacle 560 may comprise preexisting furniture within a home and/or outdoors, but may also be a specially designed structure.
- Spatial profiling device 520 may attach to base obstacle 560 by, for example, an arm that supports the spatial profiling device 520.
- Spatial profiling device 520 may communicate with mobile device manager 510.
- Mobile device manager 510 may detect the locations of mobile objects and/or mobile devices and plot navigation procedures for mobile devices. Further, mobile device manager 510 may be configured to communicate with mobile devices, transmitting and/or receiving information such as navigation instructions, spatial profile information generated by the mobile device, mobile device location information, mobile object location information, a combination thereof, and/or the like.
- Obstacle system 500 may comprise one or more tunnels 570 that may be utilized by a mobile device.
- Tunnels 570 may take various shapes and may be large enough to house the mobile device. Tunnels 570 may also pass completely through, or only partially through, base obstacle 560.
- Referring to FIGS. 6A and 6B, screen display 600 may appear on a desktop or laptop computer.
- Screen display 600 may appear on a device with a touch screen interface such as a mobile phone, a tablet, and/or the like.
- Screen display 600 may display an image generated from a camera.
- Screen display 600 may display an image that is being captured in real time.
- Screen display 600 may display a mobile device 640, an obstacle 660, a starting location 680, and a final location 690.
- Mobile device 640 may be a remotely controlled device.
- A user may direct the movement of mobile device 640 based on the visual information provided to the user on screen display 600.
- The user may provide an input and specify a final location 690 to which the mobile device 640 may move.
- User input may come from a selection using a computer peripheral, such as a mouse click and/or a tap on a touch screen display.
- A user may also shift the area displayed on the screen, allowing the user to select a final location 690 beyond the initial frame shown.
- FIG. 6A illustrates a user selection of a final location 690 on the opposite side of obstacle 660 relative to the starting location 680 of the mobile device 640.
- FIG. 6B shows screen display 600 after the mobile device 640 has moved from starting location 680 to final location 690.
- Mobile device 640 may follow path 685.
- Path 685 may curve around obstacle 660 so that mobile device 640 may arrive at final location 690 without colliding with obstacle 660.
- Proximity sensors on mobile device 640 may be employed to avoid collision.
- Image recognition, employing the imaging device used to generate the image displayed on screen display 600, may be employed to plan a movement based on visually recognizing obstacle 660.
- Computation to plan and execute the mobile device 640 movement may occur on a mobile device manager that may be disposed on the user input device, the mobile device 640, or within obstacle 660.
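- One simple way to compute such a path is breadth-first search over an occupancy grid built from the spatial profile; the grid representation below is an assumption, and a real planner might also inflate obstacles by the device's radius:

```python
from collections import deque

def plan_path(grid, start, goal):
    """Shortest 4-connected path on an occupancy grid via BFS.

    `grid[r][c]` is True where a cell is blocked (e.g., by an obstacle).
    Returns a list of (row, col) waypoints from `start` to `goal`, or
    None if the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    came_from = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:     # walk back to the start
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and not grid[nr][nc] and (nr, nc) not in came_from:
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

# A wall with a gap: the returned path bends around it.
grid = [[False] * 5 for _ in range(5)]
for r in range(4):
    grid[r][2] = True                   # obstacle column
print(plan_path(grid, (0, 0), (0, 4)))
```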
- FIG. 7 is a flow diagram of an avoidance method 700.
- Spatial profiles may be received from a spatial profiling device at 710.
- The received spatial profiles may be employed in locating a mobile object.
- A mobile device may be located.
- A movement for the mobile device may be planned.
- The movement may be configured to avoid the mobile object.
- The avoidance may comprise identifying a hiding location that may be outside of the view of the mobile object.
- The movement may be planned such that an obstacle may be positioned between the mobile device and the mobile object.
- The planned movement may be communicated to the mobile device.
- FIG. 8 is a block diagram of system interaction 800.
- Information may flow into and out of mobile device manager 810.
- Mobile device manager 810 may receive spatial profiles (e.g., 820A and/or 820B) generated by a spatial profiling device (e.g., 850A and/or 850B).
- Mobile device manager 810 may receive information concerning the mobile device location 830 from mobile device 860.
- Mobile device manager 810 may send navigation instructions 840 to mobile device 860.
- A spatial profile (e.g., 820A and/or 820B) may be generated employing a spatial profiling device (e.g., 850A and/or 850B).
- Spatial profiling device 850B may reside on mobile device 860.
- Spatial profiling device 850A may reside external to mobile device manager 810 and mobile device 860.
- A spatial profiling device (e.g., 850A and/or 850B) may employ capture signals (e.g., 870A and/or 870B) to detect an external object (e.g., 880A and/or 880B).
- Capture signals may represent the capturing of an image when spatial profiling device (e.g., 850A and/or 850B) is in view of a camera. Capture signals (e.g., 870A and/or 870B) may employ emitter/detector pairs which utilize emitted infrared, visible, or ultraviolet light to detect proximity. Spatial profiling device (e.g., 850A and/or 850B) may utilize a detector to measure the amount of light reflected by external object (e.g., 880A and/or 880B) to formulate a spatial profile (e.g., 820A and/or 820B).
- FIG. 9 is a flow diagram of avoidance method 900.
- Avoidance method 900 may calibrate a mobile device at 910.
- A mobile object's initial location may be identified.
- A check may be made to determine whether the mobile device can move freely. If the mobile device cannot move freely, an escape routine may be performed at 940. If the mobile device can move freely, the location of a mobile object may be detected at 950, and the mobile device may be positioned such that an obstacle is between the mobile device and the mobile object at 960.
- FIG. 10 is a flow diagram of calibration method 1000.
- A “forward” command may be received at a mobile device at 1010.
- The “forward” command may be utilized to move the mobile device.
- The utilization of the “forward” command may cause the mobile device to move in a straight line.
- The distance traveled may be determined employing the mobile device sensing hardware. This determination may be accomplished, for example, employing counts from a wheel encoder.
- The distance traveled may also be determined employing spatial profiling. This determination may be accomplished employing a spatial profiling device such as a camera. The camera may be onboard the mobile device and/or externally mounted in a manner such that the mobile device is visible to the lens of the camera.
- The two measured distances may be compared. According to an embodiment, the comparison may result in a difference between the measured distances.
- The difference between the measured distances may be employed to calibrate the navigation instructions commanding the movement of the mobile device. In an embodiment, calibration may adjust the instructions used by the spatial profiling device, the instructions used in formulating a navigation instruction, or both.
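- A sketch of one such correction: derive a distance from encoder counts, then scale future encoder estimates by the ratio of the camera-measured distance to the encoder-derived distance. The encoder geometry constants are assumptions:

```python
import math

def encoder_distance(counts, counts_per_rev=20, wheel_diameter_m=0.06):
    """Distance implied by wheel-encoder counts (constants are assumed)."""
    return counts / counts_per_rev * math.pi * wheel_diameter_m

def calibration_factor(encoder_counts, camera_distance_m):
    """Ratio by which encoder-derived distances should be scaled so they
    match the distance measured via spatial profiling (e.g., a camera).

    Multiplying future encoder estimates by this factor is one simple way
    to apply the difference between the two measurements; the disclosure
    leaves the exact correction scheme open.
    """
    measured = encoder_distance(encoder_counts)
    return camera_distance_m / measured

# The camera saw 0.95 m where the encoders claimed ~1.00 m of travel:
factor = calibration_factor(encoder_counts=106, camera_distance_m=0.95)
print(round(factor, 3))
```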
- FIG. 11 is a flow diagram of a calibration method 1100.
- A “calibration” command may be received at a mobile device at 1110.
- The “calibration” command may be utilized to move the mobile device in a predefined path.
- The predefined path may follow a circle, a square, a triangle, a combination thereof, and/or the like.
- The distance traveled may be determined employing the mobile device hardware. This determination may be accomplished employing counts from a wheel encoder.
- The distance traveled may also be determined employing spatial profiling.
- This determination may be accomplished employing a spatial profiling device such as, for example, a camera, which may be onboard the mobile device and/or externally mounted in a manner such that the mobile device is visible to the lens of the camera.
- The measurement captured employing spatial profiling may provide additional information, which may include information concerning the positioning of the spatial profiling device. For example, moving along a predefined shape may yield differing results between a spatial profiling device that is pointed directly toward the ground and one that is tilted.
- The two measured distances may be compared. According to an embodiment, the comparison may result in a difference between the measured distances.
- The difference between the measured distances may be employed to calibrate the instructions used in commanding the movement of the mobile device.
- The calibration may adjust the instructions employed by the spatial profiling device, the instructions employed in formulating a navigation instruction, or both.
- FIG. 12 is a flow diagram of an obstacle detection method 1200.
- A spatial profile of an environment may be captured at 1210.
- A first color may be identified based on the captured spatial profile.
- A second color may be identified based on the captured spatial profile.
- A mobile device may be employed to traverse the first color and interact with the second color.
- The interaction with the second color may occur by commanding the mobile device to drive into the second color.
- A mobile device equipped with proximity sensors may drive close to the edge of the area labeled as the second color.
- The mobile device may interact with the second color employing an emitter/detector pair of infrared light, visible light, ultraviolet light, a combination thereof, and/or the like.
- Data may be collected based on the mobile device's interaction with the second color.
- The data collected may be an inertial measurement from accelerometers, gyroscopes, a combination thereof, and/or the like. This data may reflect a collision with the area marked as the second color.
- The detector may or may not detect a reflection.
- The collected data may be employed to determine whether the second color is traversable terrain.
- A sudden change in acceleration may reflect a collision with an obstacle.
- In that case, the second color may be labeled as terrain that may not be traversable.
- A strong detected reflection may indicate the presence of an obstacle that may not be traversable, while a lack of detected reflection may indicate that an obstacle may not be present and the terrain may be traversable. Both of these embodiments, along with others, may be employed to detect obstacles.
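- A sketch combining both signals into a traversability label for a color region; the thresholds are assumed tuning values:

```python
def label_terrain(color, accel_jumps, reflection_strength,
                  accel_threshold=5.0, reflection_threshold=0.5):
    """Label a color region as traversable or not from interaction data.

    `accel_jumps` are acceleration-magnitude changes recorded while
    driving at the region; `reflection_strength` is the emitter/detector
    return when probing it. Either a collision-sized jump or a strong
    reflection marks the region as an obstacle.
    """
    collided = any(j > accel_threshold for j in accel_jumps)
    blocked = reflection_strength > reflection_threshold
    return {"color": color, "traversable": not (collided or blocked)}

# A dark rug (no impact, weak reflection) vs. a wall (hard impact):
print(label_terrain("dark-blue", [0.2, 0.4], 0.1))   # traversable
print(label_terrain("white", [0.3, 9.1], 0.8))       # not traversable
```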
- FIG. 13 is a flow diagram of a mobile object detection method 1300.
- A first spatial profile may be captured at 1310.
- A second spatial profile may be captured.
- A third spatial profile may be captured.
- The difference between the second and third spatial profiles may be calculated.
- Motion detection techniques may be employed where, for example, spatial profiles are images.
- The first spatial profile may be compared with the third spatial profile to adaptively identify the foreground. Step 1350 may be repeated in order to adaptively identify the foreground to compare with the background generated from the first spatial profile and other continuously updated first spatial profiles.
- Mobile object detection method 1300 may be completed employing computer vision techniques.
- Computer vision techniques may comprise foreground detection techniques such as, for example, background detection, temporal average filters, training times, Gaussian adaptation, 3D data acquisition and reconstruction, a combination thereof, and/or the like.
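- A NumPy sketch loosely following method 1300: the second/third difference isolates recent motion, and the first/third comparison separates the moving object from the static background; the threshold is an assumed tuning value:

```python
import numpy as np

def moving_object_mask(profile1, profile2, profile3, threshold=25):
    """Foreground mask from three successive spatial profiles (images).

    The difference between the second and third profiles isolates recent
    motion, and comparing the (older) first profile with the third
    separates the moving object from the static background.
    """
    recent_motion = np.abs(profile3.astype(int) - profile2.astype(int))
    vs_background = np.abs(profile3.astype(int) - profile1.astype(int))
    return (recent_motion > threshold) & (vs_background > threshold)

# Synthetic example: a bright blob appears in the third profile.
f1 = np.zeros((4, 4), dtype=np.uint8)
f2 = np.zeros((4, 4), dtype=np.uint8)
f3 = f2.copy()
f3[1, 1] = 200
print(moving_object_mask(f1, f2, f3).astype(int))
```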
- FIG. 14 is a flow diagram of an escape method 1400.
- A mobile device may be temporarily rendered immobile at 1410.
- A mobile device may be rendered temporarily immobile by a force external to the mobile device such as, for example, a cat grabbing or blocking the mobile device.
- The mobile device may be commanded to struggle at a first defined level to become mobile.
- A check may be performed to determine if the mobile device is able to move freely. If the mobile device is able to move freely, escape method 1400 may end at 1440. If the mobile device is unable to move freely, at 1450, the mobile device may be commanded to struggle at a second defined level. The first defined level may be greater than the second defined level.
- The mobile device may be commanded to cease movement.
- This movement pattern may simulate a “dying” effect seen when a larger animal catches a smaller animal, for example, when a cat catches a mouse.
- The smaller animal may lose life until the smaller animal ceases to move or struggle.
- The larger animal may then lose interest in the smaller animal, allowing the smaller animal to become mobile again. This behavior is reflected by checking to see whether the mobile device is free again at 1430.
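- A sketch of method 1400 as a small state machine; the `drive` and `is_free` hooks, struggle levels, and timings are assumptions:

```python
import time

def escape_routine(drive, is_free, high=1.0, low=0.3, burst_s=0.5):
    """Struggle hard, then weakly, then 'play dead' until released.

    `drive(level)` commands a struggling motion at the given intensity
    and `is_free()` reports whether the device can move freely; both are
    hypothetical hooks. The first level is greater than the second, and
    ceasing movement simulates the 'dying' effect described above.
    """
    drive(high)                     # struggle at the first defined level
    time.sleep(burst_s)
    if is_free():
        return True
    drive(low)                      # struggle at the second defined level
    time.sleep(burst_s)
    drive(0.0)                      # cease movement: play dead
    while not is_free():            # wait for the larger animal to lose
        time.sleep(0.2)             # interest in the smaller animal
    return True
```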
- Referring to FIG. 15, an example system 1500 for implementing some embodiments includes a computing device 1510.
- Components of computer 1510 may include, but are not limited to, a processing unit 1520, a system memory 1530, and a system bus 1521 that may couple various system components, including the system memory, to the processing unit 1520.
- Computing device 1510 may comprise a variety of computer readable media.
- Computer readable media may be media that may be accessed by computing device 1510 and may comprise volatile and/or nonvolatile media, and/or removable and/or non-removable media.
- Computer readable media may comprise computer storage media and communication media.
- Computer storage media may comprise volatile and/or nonvolatile, and/or removable and/or non-removable media implemented in a method and/or technology for storage of information such as computer readable instructions, data structures, program modules or other data.
- Computer storage media comprises, but is not limited to, random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 1510.
- Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media.
- A “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
- Communication media may comprise wired media, such as a wired network or direct-wired connection, and wireless media, such as acoustic, radio frequency (RF), infrared, and other wireless media configured to communicate modulated data signal(s). Combinations of any of the above may also be included within the scope of computer readable media.
- The system memory 1530 may comprise computer storage media in the form of volatile and/or nonvolatile memory such as ROM 1531 and RAM 1532.
- RAM 1532 may comprise data and/or program modules that may be immediately accessible to and/or presently being operated on by processing unit 1520.
- FIG. 15 illustrates operating system 1534, application programs 1535, other program modules 1536, and/or program data 1537 that may be stored in RAM 1532.
- Computing device 1510 may comprise other removable/non-removable volatile/nonvolatile computer storage media.
- FIG. 15 illustrates a hard disk drive 1541 that may read from and/or write to non-removable, nonvolatile magnetic media, a magnetic disk drive 1551 that may read from or write to a removable, nonvolatile magnetic disk 1552, a flash drive reader 1557 that may read flash drive 1558, and an optical disk drive 1555 that may read from or write to a removable, nonvolatile optical disk 1556 such as a Compact Disc Read Only Memory (CD ROM), Digital Versatile Disc (DVD), Blu-ray Disc™ (BD), or other optical media.
- Other removable/non-removable, volatile/nonvolatile computer storage media that may be employed in the example operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, solid state ROM, and the like.
- The hard disk drive 1541 may be connected to the system bus 1521 through a non-removable memory interface such as interface 1540, and magnetic disk drive 1551 and optical disk drive 1555 may be connected to the system bus 1521 by a removable memory interface, such as interface 1550.
- The drives and their associated computer storage media discussed above and illustrated in FIG. 15 may provide storage of computer readable instructions, data structures, program modules, and other data for the computing device 1510.
- Hard disk drive 1541 is illustrated as storing operating system 1542, application programs 1543, program data 1545, and other program modules 1544.
- Non-volatile memory may include instructions to, for example, discover and configure IT device(s), create device-neutral user interface command(s), combinations thereof, and/or the like.
- A user may enter commands and information into computing device 1510 through input devices such as a keyboard 1563, a microphone 1565, a camera 1566, an actuator 1567, and a pointing device 1564, such as a mouse, trackball, touch pad, and/or touch screen interface.
- Input interface 1560 may be coupled to the system bus, but input devices may also be connected by other interface and bus structures, such as a parallel port, a game port, or a universal serial bus (USB).
- Actuator 1567 may be connected to the system bus 1521 via input interface 1560.
- A 3D sensor 1561 may be connected to the system bus 1521 via input interface 1560.
- Examples of 3D sensor(s) 1561 comprise an accelerometer, an inertial navigation unit, a 3D digitizer, and/or the like.
- A modem 1562 may be connected to the system bus 1521 via input interface 1560.
- Encoder 1568 may be connected to system bus 1521 via input interface 1560. Encoder 1568 may be coupled to wheels and/or provide rotational data.
- A monitor 1591 or other type of display device may be connected to the system bus 1521 via an interface, such as a video interface 1590.
- Other devices such as, for example, speakers 1597 and motion drive 1596 may be connected to the system via output interface 1595.
- Motion drive 1596 may comprise a battery to supply electricity, DC or AC motors, and any necessary control mechanisms such as, for example, an H bridge.
- Computing device 1510 may be operated in a networked environment using logical connections to one or more remote computers, such as a remote computer 1580.
- The remote computer 1580 may be a personal computer, a mobile device, a hand-held device, a server, a router, a network PC, a medical device, a peer device, or other common network node, and may comprise many or all of the elements described above relative to the computing device 1510.
- The logical connections depicted in FIG. 15 include a local area network (LAN) 1571 and a wide area network (WAN) 1573, but may also comprise other networks such as, for example, a cellular network.
- Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet.
- When used in a LAN networking environment, computing device 1510 may be connected to the LAN 1571 through a network interface or adapter 1570.
- When used in a WAN networking environment, computing device 1510 typically includes a modem 1562 or other means for establishing communications over the WAN 1573, such as the Internet.
- The modem 1562, which may be internal or external, may be connected to the system bus 1521 via the input interface 1560 or other appropriate mechanism.
- The modem 1562 may be wired or wireless. Examples of wireless devices may comprise, but are not limited to: Wi-Fi, Near-field Communication (NFC), and Bluetooth®.
- Program modules depicted relative to computing device 1510 may be stored in a remote computer 1580.
- FIG. 15 illustrates remote application programs 1585 as residing on remote computer 1580.
- The network connections shown are exemplary, and other means of establishing a communications link between the computers may be used.
- LAN 1571 and WAN 1573 may provide a network interface to communicate with other distributed infrastructure management device(s); with IT device(s); with users remotely accessing input interface 1560; combinations thereof, and/or the like.
- Alternative embodiments may comprise utilizing multiple mobile devices to create a game.
- the game may be played on a tabletop or on the ground.
- the game may involve user control of multiple mobile devices.
- Alternative embodiments may comprise utilizing mobile devices to entertain children and/or adults. Children and/or adults may chase mobile devices.
- The present embodiments should not be limited by any of the above-described embodiments.
Abstract
A mobile device comprises a platform, a motion drive, and a spatial profiling device. The motion drive is configured to propel the platform according to navigation instructions. The spatial profiling device is configured to capture a spatial profile of a motion area. A spatial profile is received from the spatial profiling device. A mobile object is located employing the spatial profile. A movement is planned for the mobile device between a first location and a second location in the motion area. The movement is configured to avoid the mobile object. The movement is communicated as navigation instructions to the motion drive.
Description
- This application is a continuation of U.S. patent application Ser. No. 15/405,666, which claims the benefit of U.S. Provisional Application No. 62/278,233, filed Jan. 13, 2016, and U.S. Provisional Application No. 62/357,974, filed Jul. 2, 2016, which are hereby incorporated by reference in their entirety.
- The accompanying figures are included to provide a further understanding, and are incorporated in and constitute a part of this specification. The drawings illustrate one or more embodiments. As such, the disclosure will become more fully understood from the following detailed description, taken in conjunction with the accompanying figures, which are briefly described above.
- This disclosure will now be described more fully with reference to the accompanying drawings, in which embodiments of this document are shown. This document should be read to include embodiments of many different forms and should not be construed as being limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concepts contained within this document to those skilled in the art.
- Referring to
FIG. 1 , ananimal exerciser system 100 is shown. According to an embodiment, thesystem 100 may comprise amobile device 140, aspatial profiling device 120, and amobile device manager 110. According to an embodiment, at least themobile device 140 and amobile object 150 may be disposed within amotion area 130. - The
mobile device 140 may further comprise aplatform 142, amotion drive 144, and acommunications circuit 146. In an embodiment, themotion drive 144 andcommunications circuit 146 may be disposed on theplatform 142. -
Motion drive 144 may be configured to receive navigation instructions and to propelplatform 142 according to the navigation instructions. To accomplish this movement,motion drive 144 may comprise a battery to supply electricity, DC or AC motors, and control mechanisms such as, for example, an H bridge.Motion drive 144 may propel the platform using mechanisms such as wheels, flopping wheels, tracks, plungers, legs, magnets, compressed air, a combination thereof, and/or the like. - The at least one
communications circuit 146 may be configured to communicate withcommunications circuit 112 disposed on themobile device manager 110 vianavigation instructions 148. This communication may occur via a wired interface, a wireless interface, a combination thereof, and/or the like. The communication may utilize wireless communication protocols such as Bluetooth®, LTE, Wi-Fi, radio waves, a combination thereof, and/or the like. Thecommunications circuit 146 may send and/or receivenavigation instructions 148. According to thenavigation instructions 148, themotion drive 144 may be configured to propel theplatform 142 to movemobile device 140. - The
spatial profiling device 120 may comprise one or more sensors configured to collect spatial information in and/or around themotion area 130. Examples of sensors comprise cameras, proximity sensors, motion sensors, a combination thereof, and/or the like. Thespatial profiling device 120 may comprise a webcam, infrared projector, 3D scanner systems, such as, for example, a Microsoft Kinect™, a combination thereof, and/or the like. Thespatial profiling device 120 may operate using pairs of emitters and detectors to detect objects. The spatial profiling device may capturespatial profiles 125 of themotion area 130.Spatial profiles 125 may comprise images captured by sensors, such as a camera, or composites of themotion area 130 created using proximity sensors, motion sensors, a combination thereof, and/or the like. Thespatial profiles 125 may be updatable and represent a depiction of themotion area 130. - The
- The mobile device manager 110 may comprise at least one communications circuit 112, one or more processors 114, and memory 116. The at least one communications circuit 112 may be configured to receive spatial profiles 125 from the spatial profiling device 120. The at least one communications circuit 112 may be configured to communicate with communications circuit 146 disposed on the mobile device 140 via navigation instructions 148. According to an embodiment, the communications circuit 146 may direct the movement of mobile device 140 employing the navigation instructions 148. Processors 114 may comprise a microprocessor produced by microprocessor manufacturers such as Advanced Micro Devices, Inc. (AMD) of Sunnyvale, Calif., Atmel Corporation of San Jose, Calif., Intel Corporation of Santa Clara, Calif., or Texas Instruments Inc. of Dallas, Tex. Processors 114 may comprise and/or be other logic-based controllers such as FPGAs or PLCs. Memory 116 may comprise nonvolatile memory configured to store processing instructions. Examples of memory 116 comprise ROM, EEPROM, Flash, a combination thereof, and/or the like. Memory 116 may comprise volatile memory such as, for example, RAM.
- Contained within memory 116 may be instructions 117 that, when executed, may cause the mobile device manager 110 to receive spatial profiles 125 from the spatial profiling device and locate mobile object 150 employing the spatial profiles 125. Mobile device manager 110 may locate mobile device 140. Mobile device manager 110 may plan a movement for mobile device 140 between a first location and a second location in motion area 130. The movement may be configured to avoid mobile object 150. The movement may be communicated to mobile device 140 as navigation instructions 148 by employing communications circuit 112. The processing instructions 117 may cause the mobile device manager 110 to plan the movement employing an expected movement of mobile object 150. The mobile device 140 may be located using infrared light. The mobile device 140 may be located employing a wheel encoder. Mobile device manager 110 may distinguish colors in received spatial profiles 125. Mobile device manager 110 may detect a distance of at least one of the mobile device 140 and the mobile object 150 from a known location in the received spatial profiles 125. The mobile object 150 may be located employing motion detection and background detection techniques.
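- As one minimal sketch of how instructions 117 might be organized, the Python fragment below runs a perceive-plan-communicate loop. The DemoProfiler stub, the one-step planner, and all coordinates are illustrative assumptions, not the disclosed implementation.

```python
import math
import random
import time

class DemoProfiler:
    """Hypothetical stand-in for a spatial profiling device; a real one
    would return an image or depth map rather than ready-made coordinates."""
    def capture(self):
        return {"object": (random.uniform(0.0, 5.0), random.uniform(0.0, 5.0)),
                "device": (1.0, 1.0)}

def plan_step(device, target, avoid, clearance=0.75, step_len=0.2):
    """One-step planner: head toward the target, sidestepping if the next
    step would come within `clearance` of the mobile object."""
    dx, dy = target[0] - device[0], target[1] - device[1]
    norm = math.hypot(dx, dy) or 1.0
    step = (device[0] + step_len * dx / norm, device[1] + step_len * dy / norm)
    if math.hypot(step[0] - avoid[0], step[1] - avoid[1]) < clearance:
        # Perpendicular sidestep away from the straight-line course.
        step = (device[0] - step_len * dy / norm, device[1] + step_len * dx / norm)
    return step

def send_navigation_instructions(step):
    """Hypothetical stand-in for the communications circuit."""
    print(f"navigate to ({step[0]:.2f}, {step[1]:.2f})")

profiler = DemoProfiler()
target = (4.0, 4.0)                      # the second location
for _ in range(3):                       # a few manager-loop iterations
    profile = profiler.capture()         # receive a spatial profile
    step = plan_step(profile["device"], target, avoid=profile["object"])
    send_navigation_instructions(step)   # communicate the movement
    time.sleep(0.1)
```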
- According to an embodiment, mobile device manager 110 may comprise a touch screen display 118. According to an embodiment, the mobile device manager 110 may comprise a device containing a touch screen display such as, for example, a mobile phone, a tablet, a desktop, a laptop computer, a combination thereof, and/or the like. In embodiments utilizing touch screen display 118, memory 116 may contain instructions that, when executed, cause the mobile device manager 110 to show at least one of the spatial profiles on touch screen display 118. The mobile device manager 110 may also determine a second location via a selection on touch screen display 118.
- The mobile object 150 may be an animal such as a cat, a dog, a human, and/or the like. The mobile object 150 may comprise a second mobile device. The second mobile device may be similar to mobile device 140. Essentially, mobile object 150 may be an item that has the capacity to move.
- The motion area 130 may comprise a region of space in which the mobile device 140 and/or the mobile object 150 may operate. The mobile device 140 and/or the mobile object 150 may move in motion area 130. Further, spatial profiling device 120 may be configured to capture portions of motion area 130 and/or the entirety of motion area 130. According to an embodiment, motion area 130 may be a space within a home dwelling, a room, and/or the like.
- Referring to FIG. 2, an animal exerciser system 200 is shown. According to an embodiment, system 200 may comprise a mobile device 240 and an obstacle 260. A spatial profiling device 220, that may be configured to capture a two- or three-dimensional spatial profile, may be positioned relative to and/or on the obstacle 260. System 200 may operate within motion area 230 and interact with mobile object 250. Mobile object 250 may comprise, as illustrated in this example, a cat.
- In system 200, mobile device 240 may move to avoid mobile object 250 by positioning itself (mobile device 240) such that obstacle 260 is between mobile device 240 and mobile object 250. To accomplish this objective, obstacle 260, which may comprise processing features, may utilize a spatial profile of motion area 230. The spatial profile may be created employing spatial profiling device 220. According to an embodiment, obstacle 260 may comprise a preexisting piece of furniture either in a dwelling and/or outdoors. The processing may be performed in an attachment to the preexisting obstacle 260. According to an embodiment, spatial profiling device 220 may comprise a camera and the spatial profiles captured may comprise images. To capture motion area 230, the spatial profiling device 220 may utilize a wide-angle lens. In terms of positioning, spatial profiling device 220 may be positioned vertically above obstacle 260 so that spatial profiling device 220 may identify obstacle 260 within motion area 230. According to an embodiment, spatial profiling device 220 may be affixed on an arm that is attached to obstacle 260 at a height that allows spatial profiling device 220 to obtain a spatial profile of motion area 230. According to an embodiment, spatial profiling device 220 may identify mobile device 240 and mobile object 250 using spatial profiling assist equipment. For example, mobile device 240 may emit a beacon using a light emitting diode that may be in the visible and/or infrared spectrum that spatial profiling device 220 is configured to detect. In terms of tracking mobile object 250, the spatial profiling device 220 may utilize motion detection techniques. Thus, as mobile object 250 moves around obstacle 260, mobile device 240 may move to a position such that obstacle 260 is positioned between mobile device 240 and mobile object 250.
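- The positioning behavior described above reduces to a small geometric computation: place the device on the ray from the mobile object through the obstacle, beyond the obstacle. A minimal sketch follows, assuming planar coordinates and a fixed standoff distance; both assumptions are for illustration only.

```python
import math

def hiding_position(object_xy, obstacle_xy, standoff=0.5):
    """Return a point on the far side of the obstacle from the mobile
    object: along the ray from object through obstacle, `standoff`
    metres beyond the obstacle's centre."""
    ox, oy = object_xy
    bx, by = obstacle_xy
    dx, dy = bx - ox, by - oy
    norm = math.hypot(dx, dy)
    if norm == 0.0:
        return obstacle_xy  # object is at the obstacle; nowhere better to go
    return (bx + standoff * dx / norm, by + standoff * dy / norm)

# If the cat (mobile object 250) is at (0, 0) and obstacle 260 at (2, 0),
# mobile device 240 should retreat to roughly (2.5, 0).
print(hiding_position((0.0, 0.0), (2.0, 0.0)))
```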
- Referring to FIG. 3, an animal exerciser system 300 is shown. In an embodiment, system 300 may comprise a mobile device 340, an obstacle 360, and a spatial profiling device 320. System 300 may operate within motion area 330 and interact with a mobile object 350. Mobile object 350 may be an animal such as the illustrated cat.
- In contrast to system 200 depicted in FIG. 2, system 300 may utilize a spatial profiling device 320 that is detached from obstacle 360. According to an embodiment, spatial profiling device 320 may be attached to, for example, the ceiling of a room in order to capture motion area 330. Spatial profiling device 320 may comprise one or more sensors such as cameras, proximity sensors, motion sensors, a combination thereof, and/or the like. Motion area 330 may be a two- and/or three-dimensional space. Spatial profiling device 320 may also reside in alternative locations, such as, for example, on a tabletop, a counter, a shelf, other existing furniture within a room, a combination thereof, and/or the like. Further, for outdoor applications, spatial profiling device 320 may be mounted outside of a building, such as on the exterior of a building and/or home.
- Referring to FIG. 4A, mobile device 440 is shown. According to an embodiment, mobile device 440 may comprise a mobile device platform 442, wheels, a stability nub 430, and a tail 460. Mobile device platform 442 may be a surface upon which circuitry may be disposed. For example, platform 442 may be an insulated sheet and/or a type of circuit board. Mobile device platform 442 may be disposed within a plastic casing, while the wheels may interface with mobile device 440 circuitry and may be utilized, at least in part, to determine distance, speed, acceleration, a combination thereof, and/or the like. Stability nub 430 may be positioned to balance the movement of mobile device 440. According to an embodiment, stability nub 430 may be a wheel. According to an embodiment, stability nub 430 may be a spherically shaped plastic. In terms of operation, platform 442 and the wheels may be configured to allow mobile device 440 to remain mobile even when in a flipped orientation. That is, the motion drive 444 may be configured to keep mobile device 440 functional and to propel platform 442 even when platform 442 is in a flipped orientation. According to an embodiment, tail 460 may comprise an attachment to entice the attention of a mobile object, such as a cat. Tail 460 may be a string but may also comprise various colors, noise makers such as a bell, other attention generating items, a combination thereof, and/or the like.
- FIG. 4B is a block diagram of mobile device platform 442. According to an embodiment, platform 442 may comprise a communications circuit 446 and/or a motion drive 444. Communications circuit 446 may comprise circuitry configured to interface with other components contained on mobile device platform 442. Communications circuit 446 may be configured to communicate with systems external to the mobile device. For example, communications circuit 446 may send data concerning mobile device 440 such as distance measurements, speed measurements, inertial measurements, a combination thereof, and/or the like. Further, communications circuit 446 may be configured to receive instructions from an external system that may direct the movement of mobile device 440.
- Motion drive 444 may be configured to receive navigation instructions and to move the wheels according to the navigation instructions. To accomplish this movement, motion drive 444 may comprise a battery to supply electricity, DC or AC motors, and/or control mechanisms such as an H bridge. Motion drive 444 may also operate without employing wheels, propelling platform 442 using mechanisms such as tracks, plungers, legs, magnets, compressed air, a combination thereof, and/or the like.
- Platform 442 may house a beacon 422. Beacon 422 may emit an electromagnetic signal. An electromagnetic signal may comprise a modulated wave or synchronized oscillations of electric and magnetic fields. Examples of electromagnetic signals comprise a signal in the ultraviolet, visible light, infrared, or radio wave spectrum, a combination thereof, and/or the like. The signal emitted by beacon 422 may allow an external imaging device to detect mobile device 440.
- Platform 442 may house an inertial measurement device 424. The inertial measurement device 424 may comprise, for example, a device configured to measure changes in acceleration, magnitude, and/or direction. Examples comprise an accelerometer and/or gyroscope configured to measure changes in acceleration of the mobile device 440. This information may be employed to determine the orientation of mobile device 440, collisions, unlevel terrain, other types of interactions that mobile device 440 may have with the environment, a combination thereof, and/or the like.
- Platform 442 may operate without reference to an external system and house a spatial profiling device 420 along with a mobile device manager 410. This autonomous embodiment may utilize spatial profiling device 420 to generate spatial profile(s) of the environment in which mobile device 440 operates. Spatial profiling device 420 may comprise a camera mounted such that the lens captures visual information above mobile device 440. Spatial profiles may be created employing the images captured by the lens. Spatial profiling device 420 may comprise a light emitter and detector pair to generate a spatial profile. In this embodiment, a light emitter such as a light emitting diode may produce electromagnetic waves such as infrared light, ultraviolet light, visible light, a combination thereof, and/or the like. The detector may be, for example, a light emitting diode, a photodiode, a phototransistor, a combination thereof, and/or the like. The detector may be configured to capture reflections of the emitted light and, using the reflections, create a spatial profile of the environment surrounding mobile device 440. That is, the spatial profiles may map the locations of objects as well as predict the location of obstacles. According to an embodiment, spatial profiling device 420 may comprise several emitter/detector pairs.
- Utilizing the spatial profiles, mobile device manager 410 may direct the movement of mobile device 440 to avoid obstacles as well as other mobile objects that may obstruct the movement of mobile device 440. Mobile device manager 410 may utilize communications circuit 446 to control motion drive 444 to execute these maneuvers. Mobile device manager 410, however, may also utilize instructions received by the communications circuit, with or without information from the spatial profiling device 420, in order to direct the movement of mobile device 440. Mobile device 440 may operate without the need for any external navigation instructions but may still have the capability to receive and utilize commands or instructions sent from an external system.
- Referring to FIG. 5, an obstacle system 500 is shown. In an embodiment, obstacle system 500 may comprise a base obstacle 560, a spatial profiling device 520, a mobile device manager 510, and one or more tunnels 570. Base obstacle 560 may comprise preexisting furniture within a home and/or outdoors, but may also be a specially designed structure. In an embodiment, spatial profiling device 520 may attach to base obstacle 560 by, for example, an arm that supports the spatial profiling device 520. Spatial profiling device 520 may communicate with mobile device manager 510. Utilizing captured spatial profiles, mobile device manager 510 may detect the locations of mobile objects and/or mobile devices and plot navigation procedures for mobile devices. Further, mobile device manager 510 may be configured to communicate with mobile devices, transmitting and/or receiving information such as navigation instructions, spatial profile information generated by the mobile device, mobile device location information, mobile object location information, a combination thereof, and/or the like.
- Obstacle system 500 may comprise one or more tunnels 570 that may be utilized by a mobile device. According to an embodiment, tunnels 570 may take various shapes and may be large enough to house the mobile device. Tunnels 570 may also pass completely through or only partially through base obstacle 560.
- Referring to FIG. 6A, a screen display 600 is shown. Screen display 600 may appear on a desktop or laptop computer. Screen display 600 may appear on a device with a touch screen interface such as a mobile phone, a tablet, and/or the like. Screen display 600 may display an image generated from a camera. Screen display 600 may display an image that is being captured in real-time. In an embodiment, screen display 600 may display a mobile device 640, an obstacle 660, a starting location 680, and a final location 690.
- Mobile device 640 may be a remotely controlled device. A user may direct the movement of mobile device 640 based on the visual information provided to the user on screen display 600. The user may provide an input and specify a final location 690 to which the mobile device 640 is to move. User input may come from a selection using a computer peripheral, such as a mouse click, and/or a tap on a touch screen display. A user could potentially shift the area displayed on the screen, allowing the user to make a final location 690 selection beyond the initial frame shown. FIG. 6A illustrates a user selection of a final location 690 on the opposite side of obstacle 660 relative to the starting location 680 of the mobile device 640.
- FIG. 6B shows screen display 600 after the mobile device 640 has moved from starting location 680 to final location 690. To complete this movement, mobile device 640 may follow path 685. In an embodiment, path 685 may curve around obstacle 660 so that mobile device 640 may arrive at final location 690 without colliding with obstacle 660. Proximity sensors on mobile device 640 may be employed to avoid collision.
- Image recognition, employing the imaging device used to generate the image displayed on screen display 600, may be employed to plan a movement based on visually recognizing obstacle 660. Computation to plan and execute the mobile device 640 movement may occur on a mobile device manager that may be disposed on the user input device, the mobile device 640, or within obstacle 660.
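- A curved path such as path 685 may be approximated by inserting a single detour waypoint when the straight segment from starting location 680 to final location 690 passes too close to obstacle 660. The sketch below assumes a circular obstacle and planar coordinates; it is one simple possibility, not the only contemplated planner.

```python
import math

def detour_path(start, goal, obstacle_center, obstacle_radius, margin=0.2):
    """Return [start, waypoint, goal], inserting a waypoint only when the
    straight segment passes too close to a circular obstacle."""
    sx, sy = start
    gx, gy = goal
    cx, cy = obstacle_center
    dx, dy = gx - sx, gy - sy
    length = math.hypot(dx, dy)
    if length == 0.0:
        return [start]
    # Closest point on the segment to the obstacle centre.
    t = max(0.0, min(1.0, ((cx - sx) * dx + (cy - sy) * dy) / (length ** 2)))
    px, py = sx + t * dx, sy + t * dy
    gap = math.hypot(cx - px, cy - py)
    if gap >= obstacle_radius + margin:
        return [start, goal]  # the straight path already clears the obstacle
    if gap == 0.0:
        ux, uy = -dy / length, dx / length  # obstacle dead ahead: sidestep
    else:
        ux, uy = (px - cx) / gap, (py - cy) / gap
    shift = obstacle_radius + margin
    waypoint = (cx + ux * shift, cy + uy * shift)
    return [start, waypoint, goal]

# An obstacle just off the straight line forces a detour around it.
print(detour_path((0, 0), (4, 0), obstacle_center=(2.0, 0.1), obstacle_radius=0.5))
```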
- FIG. 7 is a flow diagram of an avoidance method 700. According to an embodiment, spatial profiles may be received from a spatial profiling device at 710. At 720, the received spatial profiles may be employed in locating a mobile object. At 730, a mobile device may be located. At 740, a movement for the mobile device may be planned. The movement may be configured to avoid the mobile object. According to an embodiment, the avoidance may comprise identifying a hiding location that may be outside of the view of the mobile object. In another embodiment, the movement may be planned such that an obstacle may be positioned between the mobile device and the mobile object. At 750, the planned movement may be communicated to the mobile device.
- FIG. 8 is a block diagram of system interaction 800. According to an embodiment, information may flow into and out of mobile device manager 810. For example, spatial profiles (e.g., 820A and/or 820B) may flow from a spatial profiling device (e.g., 850A and/or 850B) into mobile device manager 810. Further, mobile device manager 810 may receive information concerning the mobile device location 830 from mobile device 860. In terms of transmitted information, mobile device manager 810 may send navigation instructions 840 to mobile device 860.
- A spatial profile (e.g., 820A and/or 820B) may be generated employing a spatial profiling device (e.g., 850A and/or 850B). Spatial profiling device 850B may reside on mobile device 860. Spatial profiling device 850A may reside external to mobile device manager 810 and mobile device 860. In either embodiment, the spatial profiling device (e.g., 850A and/or 850B) may utilize capture signals (e.g., 870A and/or 870B) to interface with an external object (e.g., 880A and/or 880B). Capture signals (e.g., 870A and/or 870B) may represent the capturing of an image when the spatial profiling device (e.g., 850A and/or 850B) is in view of a camera. Capture signals (e.g., 870A and/or 870B) may employ emitter/detector pairs which utilize emitted infrared, visible, or ultraviolet light to detect proximity. The spatial profiling device (e.g., 850A and/or 850B) may utilize a detector to measure the amount of light reflected by external object (e.g., 880A and/or 880B) to formulate a spatial profile (e.g., 820A and/or 820B).
- FIG. 9 is a flow diagram of avoidance method 900. According to an embodiment, avoidance method 900 may calibrate a mobile device at 910. At 920, a mobile object's initial location may be identified. At 930, a check may be made to determine whether the mobile device can move freely. If the mobile device cannot move freely, an escape routine may be performed at 940. If the mobile device can move freely, the location of a mobile object may be detected at 950 and the mobile device may be positioned such that an obstacle is between the mobile device and the mobile object at 960.
- FIG. 10 is a flow diagram of calibration method 1000. According to an embodiment, a "forward" command may be received at a mobile device at 1010. At 1020, the "forward" command may be utilized to move the mobile device. According to an embodiment, the utilization of the "forward" command may cause the mobile device to move in a straight line. At 1030, the distance traveled may be determined employing the mobile device sensing hardware. This determination may be accomplished, for example, employing counts from a wheel encoder. At 1040, distance traveled may be determined employing spatial profiling. This determination may be accomplished employing a spatial profiling device such as a camera. The camera may be onboard the mobile device and/or externally mounted in a manner such that the mobile device is visible to the lens of the camera. At 1050, the two measured distances may be compared. According to an embodiment, the comparison may result in a difference between the measured distances. At 1060, the difference between the measured distances may be employed to calibrate the navigation instructions commanding the movement of the mobile device. In an embodiment, calibration may adjust either the instructions used by the spatial profiling device, the instructions used in formulating a navigation instruction, or both.
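- As a hypothetical rendering of steps 1030 through 1060, the sketch below converts encoder counts to distance, compares that with a camera-derived distance, and produces a scale factor for later navigation instructions. The wheel geometry and numbers are illustrative assumptions.

```python
import math

def encoder_distance(counts, counts_per_rev, wheel_diameter_m):
    """Distance traveled according to the wheel encoder (step 1030)."""
    return counts / counts_per_rev * math.pi * wheel_diameter_m

def calibration_scale(encoder_m, camera_m):
    """Scale factor reconciling encoder- and camera-measured distances
    (steps 1050 and 1060): multiply future commanded distances by it."""
    if encoder_m == 0.0:
        raise ValueError("device did not move; cannot calibrate")
    return camera_m / encoder_m

# Example: 1200 counts on a 600 count/rev encoder with 60 mm wheels,
# while the camera measured 0.36 m of travel.
enc = encoder_distance(1200, 600, 0.060)    # ~0.377 m
print(enc, calibration_scale(enc, 0.36))    # scale factor ~0.955
```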
- FIG. 11 is a flow diagram of a calibration method 1100. According to an embodiment, a "calibration" command may be received at a mobile device at 1110. At 1120, the "calibration" command may be utilized to move the mobile device in a predefined path. In an embodiment, the predefined path may follow a circle, a square, a triangle, a combination thereof, and/or the like. At 1130, the distance traveled may be determined employing the mobile device hardware. This determination may be accomplished employing counts from a wheel encoder. At 1140, the distance traveled may be determined employing spatial profiling. This determination may be accomplished employing a spatial profiling device such as, for example, a camera, which may be onboard the mobile device and/or externally mounted in a manner such that the mobile device is visible to the lens of the camera. By moving along a predefined path, the measurement captured employing spatial profiling may provide additional information, which may include information concerning the positioning of the spatial profiling device. For example, moving along a predefined shape may yield differing results between a spatial profiling device that is pointed directly toward the ground and a spatial profiling device that is tilted. At 1150, the two measured distances may be compared. According to an embodiment, the comparison may result in a difference between the measured distances. At 1160, the difference between the measured distances may be employed to calibrate the instructions used in commanding the movement of the mobile device. According to an embodiment, the calibration may adjust either the instructions employed by the spatial profiling device, the instructions employed in formulating a navigation instruction, or both.
- FIG. 12 is a flow diagram of an obstacle detection method 1200. According to an embodiment, a spatial profile of an environment may be captured at 1210. At 1220, a first color may be identified based on the captured spatial profile. At 1230, a second color may be identified based on the captured spatial profile. At 1240, a mobile device may be employed to traverse the first color and interact with the second color. According to an embodiment, the interaction with the second color may occur by commanding the mobile device to drive into the second color. In another embodiment, a mobile device equipped with proximity sensors may drive close to the edge of the area labeled as the second color. The mobile device may interact with the second color employing an emitter/detector pair of infrared light, visible light, ultraviolet light, a combination thereof, and/or the like. At 1250, data may be collected based on the mobile device's interaction with the second color. In an embodiment where the mobile device is driven into the second color, the data collected may be an inertial measurement from accelerometers, gyroscopes, a combination thereof, and/or the like. This data may reflect a collision with the area marked as a second color. In the embodiment where the interaction occurs based on an emitter/detector pair, the detector may or may not detect a reflection. At 1260, the collected data may be employed to determine whether the second color is traversable terrain. In an embodiment where the mobile device is driven into the second color, a sudden change in acceleration may reflect a collision with an obstacle. In this case, the second color may be labeled as terrain that may not be traversable. In an embodiment where the interaction occurs based on an emitter/detector pair, a strong detected reflection may indicate the presence of an obstacle that may not be traversable, while a lack of detected reflection may indicate that an obstacle may not be present and the terrain may be traversable. Both of these embodiments, along with others, may be employed to detect obstacles.
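- One possible realization of steps 1250 and 1260, assuming the drive-into-the-color embodiment, is to flag the second color as non-traversable when the inertial trace shows a collision-like spike. The 1.5 g threshold below is an illustrative guess, not a disclosed value.

```python
def is_traversable(accel_trace_g, spike_threshold_g=1.5):
    """Label terrain traversable unless the inertial trace recorded a
    collision-like spike while the device drove onto the second color
    (steps 1250 and 1260)."""
    baseline = sum(accel_trace_g) / len(accel_trace_g)
    return all(abs(a - baseline) < spike_threshold_g for a in accel_trace_g)

print(is_traversable([1.0, 1.02, 0.98, 1.01]))       # smooth ride -> True
print(is_traversable([1.0, 1.01, 3.4, 0.2, 1.0]))    # impact spike -> False
```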
- FIG. 13 is a flow diagram of a mobile object detection method 1300. According to an embodiment, a first spatial profile may be captured at 1310. At 1320, a second spatial profile may be captured. At 1330, a third spatial profile may be captured. At 1340, the difference between the second and third spatial profiles may be calculated. Motion detection techniques may be employed where, for example, spatial profiles are images. At 1350, the first spatial profile may be compared with the third spatial profile to adaptively identify the foreground. Step 1350 may be repeated in order to adaptively identify the foreground to compare with the background generated from the first spatial profile and other continuously updated first spatial profiles. According to an embodiment, mobile object detection method 1300 may be completed employing computer vision techniques. Computer vision techniques may comprise foreground detection techniques such as, for example, background detection, temporal average filters, training times, Gaussian adaptation, 3D data acquisition and reconstruction, a combination thereof, and/or the like.
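- Where spatial profiles are grayscale images, the differencing step at 1340 may be sketched with NumPy as below; the threshold and the centroid step are illustrative choices, not requirements of the method.

```python
import numpy as np

def moving_object_mask(profile_prev, profile_curr, threshold=25):
    """Pixels that changed between two successive spatial profiles
    (grayscale images), per the differencing step at 1340."""
    diff = np.abs(profile_curr.astype(np.int16) - profile_prev.astype(np.int16))
    return diff > threshold

def object_centroid(mask):
    """Centroid (row, col) of the changed pixels, or None if nothing moved."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    return float(ys.mean()), float(xs.mean())

# Two synthetic 8-bit frames: a bright blob (the mobile object) moves right.
f1 = np.zeros((32, 32), dtype=np.uint8); f1[10:14, 5:9] = 200
f2 = np.zeros((32, 32), dtype=np.uint8); f2[10:14, 8:12] = 200
print(object_centroid(moving_object_mask(f1, f2)))
```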
- FIG. 14 is a flow diagram of an escape method 1400. According to an embodiment, a mobile device may be temporarily rendered immobile at 1410. A mobile device may be rendered temporarily immobile by a force external to the mobile device such as, for example, a cat grabbing or blocking the mobile device. At 1420, the mobile device may be commanded to struggle at a first defined level to become mobile. At 1430, a check may be performed to determine if the mobile device is able to move freely. If the mobile device is able to move freely, escape method 1400 may end at 1440. If the mobile device is unable to move freely, at 1450, the mobile device may be commanded to struggle at a second defined level. The first defined level may be greater than the second defined level. At 1460, the mobile device may be commanded to cease movement. This movement pattern may simulate a "dying" effect seen when a larger animal catches a smaller animal, for example, when a cat catches a mouse. As the smaller animal is caught, the smaller animal may lose life until the smaller animal ceases to move or struggle. After recognizing this behavior, the larger animal may lose interest in the smaller animal, allowing the smaller animal to become mobile again. This behavior is reflected by checking to see whether the mobile device is free again at 1430.
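- A minimal sketch of escape method 1400 follows; struggle() and can_move_freely() are hypothetical stand-ins for motion drive commands and the mobility check at 1430, and the struggle levels are arbitrary example values.

```python
import random
import time

def struggle(level):
    """Hypothetical stand-in: command the motion drive to oscillate
    with intensity proportional to `level`."""
    print(f"struggling at level {level}")

def can_move_freely():
    """Hypothetical stand-in for the mobility check at 1430;
    randomized here so the demo terminates."""
    return random.random() < 0.3

def escape_routine(first_level=1.0, second_level=0.4):
    """Sketch of escape method 1400: struggle hard, then weakly,
    then play dead, repeating until the device is released."""
    struggle(first_level)              # step 1420
    while not can_move_freely():       # step 1430
        struggle(second_level)         # step 1450: weaker struggle
        print("ceasing movement")      # step 1460: the 'dying' pause
        time.sleep(0.2)
    print("free again")                # step 1440

escape_routine()
```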
- Referring to FIG. 15, an example system 1500 for implementing some embodiments includes a computing device 1510. Components of computer 1510 may include, but are not limited to, a processing unit 1520, a system memory 1530, and a system bus 1521 that may couple various system components, including the system memory, to the processing unit 1520.
- Computing device 1510 may comprise a variety of computer readable media. Computer readable media may be media that may be accessed by computing device 1510 and may comprise volatile and/or nonvolatile media, and/or removable and/or non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media may comprise volatile and/or nonvolatile, and/or removable and/or non-removable media implemented in a method and/or technology for storage of information such as computer readable instructions, data structures, program modules, or other data. Computer storage media comprises, but is not limited to, random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 1510. Communication media typically embodies computer readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media may comprise wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media configured to communicate modulated data signal(s). Combinations of any of the above may also be included within the scope of computer readable media.
- The system memory 1530 may comprise computer storage media in the form of volatile and/or nonvolatile memory such as ROM 1531 and RAM 1532. A basic input/output system (BIOS) and/or Extensible Firmware Interface (EFI) 1533, comprising basic routines that may help to transfer information between elements within computer 1510, such as during start-up, may be stored in ROM 1531. RAM 1532 may comprise data and/or program modules that may be immediately accessible to and/or presently being operated on by processing unit 1520. By way of example, and not limitation, FIG. 15 illustrates operating system 1534, application programs 1535, other program modules 1536, and/or program data 1537 that may be stored in RAM 1532.
- Computing device 1510 may comprise other removable/non-removable, volatile/nonvolatile computer storage media. By way of example, FIG. 15 illustrates a hard disk drive 1541 that may read from and/or write to non-removable, nonvolatile magnetic media, a magnetic disk drive 1551 that may read from or write to a removable, nonvolatile magnetic disk 1552, a flash drive reader 1557 that may read flash drive 1558, and an optical disk drive 1555 that may read from or write to a removable, nonvolatile optical disk 1556 such as a Compact Disc Read Only Memory (CD ROM), Digital Versatile Disc (DVD), Blu-ray Disc™ (BD), or other optical media. Other removable/non-removable, volatile/nonvolatile computer storage media that may be employed in the example operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, solid state ROM, and the like. The hard disk drive 1541 may be connected to the system bus 1521 through a non-removable memory interface such as interface 1540, and magnetic disk drive 1551 and optical disk drive 1555 may be connected to the system bus 1521 by a removable memory interface, such as interface 1550.
- The drives and their associated computer storage media discussed above and illustrated in FIG. 15 may provide storage of computer readable instructions, data structures, program modules, and other data for the computing device 1510. In FIG. 15, for example, hard disk drive 1541 is illustrated as storing operating system 1542, application programs 1543, program data 1545, and other program modules 1544. Additionally, for example, non-volatile memory may include instructions to, for example, discover and configure IT device(s), create device-neutral user interface command(s), combinations thereof, and/or the like.
- A user may enter commands and information into computing device 1510 through input devices such as a keyboard 1563, a microphone 1565, a camera 1566, an actuator 1567, and a pointing device 1564, such as a mouse, trackball, touch pad, and/or touch screen interface. These and other input devices may be connected to the processing unit 1520 through an input interface 1560 that may be coupled to the system bus, but may be connected by other interface and bus structures, such as a parallel port, game port, or a universal serial bus (USB).
- Actuator 1567 may be connected to the system bus 1521 via input interface 1560. A 3D sensor 1561 may be connected to the system bus 1521 via input interface 1560. Examples of 3D sensor(s) 1561 comprise an accelerometer, an inertial navigation unit, a 3D digitizer, and/or the like. A modem 1562 may be connected to the system bus 1521 via input interface 1560.
- Encoder 1568 may be connected to system bus 1521 via input interface 1560. Encoder 1568 may be coupled to wheels and/or provide rotational data.
- A monitor 1591 or other type of display device may be connected to the system bus 1521 via an interface, such as a video interface 1590. Other devices, such as, for example, speakers 1597 and motion drive 1596, may be connected to the system via output interface 1595. Motion drive 1596 may comprise a battery to supply electricity, DC or AC motors, and any necessary control mechanisms such as, for example, an H bridge.
- Computing device 1510 may be operated in a networked environment using logical connections to one or more remote computers, such as a remote computer 1580. The remote computer 1580 may be a personal computer, a mobile device, a hand-held device, a server, a router, a network PC, a medical device, a peer device, or other common network node, and may comprise many or all of the elements described above relative to the computing device 1510. The logical connections depicted in FIG. 15 include a local area network (LAN) 1571 and a wide area network (WAN) 1573, but may also comprise other networks such as, for example, a cellular network. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets, and the Internet.
- When used in a LAN networking environment, computing device 1510 may be connected to the LAN 1571 through a network interface or adapter 1570. When used in a WAN networking environment, computing device 1510 typically includes a modem 1562 or other means for establishing communications over the WAN 1573, such as the Internet. The modem 1562, which may be internal or external, may be connected to the system bus 1521 via the input interface 1560, or other appropriate mechanism. The modem 1562 may be wired or wireless. Examples of wireless devices may comprise, but are not limited to: Wi-Fi, Near-field Communication (NFC), and Bluetooth®. In a networked environment, program modules depicted relative to computing device 1510, or portions thereof, may be stored in a remote computer 1580. By way of example, and not limitation, FIG. 15 illustrates remote application programs 1585 as residing on remote computer 1580. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used. Additionally, for example, LAN 1571 and WAN 1573 may provide a network interface to communicate with other distributed infrastructure management device(s); with IT device(s); with users remotely accessing input interface 1560; combinations thereof, and/or the like.
- While various embodiments have been described above, it should be understood that they have been presented by way of example, and not limitation. It will be apparent to persons skilled in the relevant art(s) that various changes in form and detail can be made therein without departing from the spirit and scope. In fact, after reading the above description, it will be apparent to one skilled in the relevant art(s) how to implement alternative embodiments. Alternative embodiments may comprise utilizing multiple mobile devices to create a game. The game may be played on a tabletop or on the ground. The game may involve user control of multiple mobile devices. Alternative embodiments may comprise utilizing mobile devices to entertain children and/or adults. Children and/or adults may chase mobile devices. Thus, the present embodiments should not be limited by any of the above described embodiments.
- In addition, it should be understood that the figures and algorithms, which highlight the functionality and advantages of the present invention, are presented for example purposes only. The architecture of the present invention is sufficiently flexible and configurable, such that it may be utilized in ways other than that shown in the accompanying figures and algorithms. For example, the steps listed in any flowchart may be re-ordered or only optionally used in some embodiments.
- It should be noted that the terms "including" and "comprising" should be interpreted as meaning "including, but not limited to".
- In this specification, “a” and “an” and similar phrases are to be interpreted as “at least one” and “one or more.” References to “the,” “said,” and similar phrases should be interpreted as “the at least one”, “said at least one”, etc. References to “an” embodiment in this disclosure are not necessarily to the same embodiment.
- It is the applicant's intent that only claims that include the express language “means for” or “step for” be interpreted under 35 U.S.C. 112. Claims that do not expressly include the phrase “means for” or “step for” are not to be interpreted under 35 U.S.C. 112.
- The disclosure of this patent document incorporates material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, for the limited purposes required by law, but otherwise reserves all copyright rights whatsoever.
- Further, the purpose of the Abstract of the Disclosure is to enable the U.S. Patent and Trademark Office and the public generally, and especially the scientists, engineers and practitioners in the art who are not familiar with patent or legal terms or phraseology, to determine quickly from a cursory inspection the nature and essence of the technical disclosure of the application. The Abstract of the Disclosure is not intended to be limiting as to the scope in any way.
Claims (20)
1. A mobile device, comprising:
a platform;
a motion drive configured to propel the platform according to navigation instructions;
a spatial profiling device configured to capture a spatial profile of a motion area;
one or more processors; and
memory storing processing instructions that, when executed, cause the mobile device to:
receive a spatial profile from the spatial profiling device;
locate a mobile object employing the spatial profile;
locate the mobile device;
plan a movement for the mobile device between a first location and a second location in the motion area, the movement configured to avoid the mobile object; and
communicate the movement as navigation instructions to the motion drive.
2. The mobile device according to claim 1 , further comprising an inertial measurement device.
3. The mobile device according to claim 2 , wherein the processing instructions, when executed, further cause the mobile device to determine, employing the inertial measurement device, the orientation of the mobile device.
4. The mobile device according to claim 1 , wherein the motion drive is configured to propel the platform in a flipped orientation.
5. The mobile device according to claim 1 , further comprising a tail.
6. The mobile device according to claim 1 , further comprising a stability nub.
7. The mobile device according to claim 1 , wherein the spatial profiling device is affixed on at least one of the following:
a ceiling;
a wall; or
a countertop.
8. The mobile device according to claim 1 , wherein the mobile object is at least one of the following:
an animal;
a person;
a cat; or
a second mobile device.
9. The mobile device according to claim 1 , wherein the mobile device further comprises a touch screen display and the processing instructions, when executed, further cause the mobile device to:
show at least one spatial profile on the touch screen display; and
determine the second location via a selection on the touch screen display.
10. The mobile device according to claim 1 , wherein the processing instructions, when executed, further cause the mobile device to plan the movement employing an expected movement of the mobile object.
11. The mobile device according to claim 1 , wherein the processing instructions, when executed, cause the mobile device to locate the mobile device employing infrared light.
12. The mobile device according to claim 1 , further comprising a wheel encoder.
13. The mobile device according to claim 12 , wherein the processing instructions, when executed, cause the mobile device to locate the mobile device employing the wheel encoder.
14. The mobile device according to claim 1 , further comprising a camera.
15. The mobile device according to claim 1 , further comprising a proximity sensor.
16. The mobile device according to claim 1 , further comprising a motion sensor.
17. The mobile device according to claim 1 , further comprising at least one of an emitter and a detector.
18. The mobile device according to claim 1 , wherein the processing instructions, when executed, further cause the mobile device to detect a distance of at least one of the mobile device and the mobile object from a known location in the spatial profile.
19. The mobile device according to claim 1 , wherein the processing instructions, when executed, cause the mobile device to locate the mobile object employing motion detection and background detection.
20. The mobile device according to claim 1 , wherein the spatial profiling device is disposed on the platform.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/041,855 US20190000041A1 (en) | 2016-01-13 | 2018-07-23 | Mobile Object Avoiding Mobile Platform |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662278233P | 2016-01-13 | 2016-01-13 | |
US201662357974P | 2016-07-02 | 2016-07-02 | |
US15/405,666 US10051839B2 (en) | 2016-01-13 | 2017-01-13 | Animal exerciser system |
US16/041,855 US20190000041A1 (en) | 2016-01-13 | 2018-07-23 | Mobile Object Avoiding Mobile Platform |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/405,666 Continuation US10051839B2 (en) | 2016-01-13 | 2017-01-13 | Animal exerciser system |
Publications (1)
Publication Number | Publication Date |
---|---|
US20190000041A1 true US20190000041A1 (en) | 2019-01-03 |
Family
ID=59274668
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/405,666 Active 2037-02-26 US10051839B2 (en) | 2016-01-13 | 2017-01-13 | Animal exerciser system |
US16/041,855 Abandoned US20190000041A1 (en) | 2016-01-13 | 2018-07-23 | Mobile Object Avoiding Mobile Platform |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/405,666 Active 2037-02-26 US10051839B2 (en) | 2016-01-13 | 2017-01-13 | Animal exerciser system |
Country Status (1)
Country | Link |
---|---|
US (2) | US10051839B2 (en) |
Citations (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5400244A (en) * | 1991-06-25 | 1995-03-21 | Kabushiki Kaisha Toshiba | Running control system for mobile robot provided with multiple sensor information integration system |
US20050105769A1 (en) * | 2003-11-19 | 2005-05-19 | Sloan Alan D. | Toy having image comprehension |
US20050273967A1 (en) * | 2004-03-11 | 2005-12-15 | Taylor Charles E | Robot vacuum with boundary cones |
US20070192910A1 (en) * | 2005-09-30 | 2007-08-16 | Clara Vu | Companion robot for personal interaction |
US20090228166A1 (en) * | 2006-01-18 | 2009-09-10 | I-Guide, Llc | Robotic Vehicle Controller |
US20090281661A1 (en) * | 2008-04-24 | 2009-11-12 | Evolution Robotics | Application of localization, positioning & navigation systems for robotic enabled mobile products |
US20100261406A1 (en) * | 2009-04-13 | 2010-10-14 | James Russell Hornsby | Interactive Intelligent Toy |
US20110202175A1 (en) * | 2008-04-24 | 2011-08-18 | Nikolai Romanov | Mobile robot for cleaning |
US20110269374A1 (en) * | 2009-04-13 | 2011-11-03 | James Russell Hornsby | Powered Hub Device for Use with Motorized Toy |
US20120316680A1 (en) * | 2011-06-13 | 2012-12-13 | Microsoft Corporation | Tracking and following of moving objects by a mobile robot |
US8525836B1 (en) * | 2012-02-07 | 2013-09-03 | Google Inc. | Systems and methods for representing information associated with objects in an area |
US20150012163A1 (en) * | 2013-07-02 | 2015-01-08 | David Crawley | Autonomous mobile platform for service applications |
US20150212521A1 (en) * | 2013-05-23 | 2015-07-30 | Irobot Corporation | Simultaneous Localization And Mapping For A Mobile Robot |
US20160358477A1 (en) * | 2015-06-05 | 2016-12-08 | Arafat M.A. ANSARI | Smart vehicle |
US20160360730A1 (en) * | 2015-06-02 | 2016-12-15 | Jiangsu Favorite Leisure Articles Co., Ltd. | Configuration-variable remote-controlled pet toy |
US20170097232A1 (en) * | 2015-10-03 | 2017-04-06 | X Development Llc | Using sensor-based observations of agents in an environment to estimate the pose of an object in the environment and to estimate an uncertainty measure for the pose |
US9717387B1 (en) * | 2015-02-26 | 2017-08-01 | Brain Corporation | Apparatus and methods for programming and training of robotic household appliances |
US20170251633A1 (en) * | 2012-09-19 | 2017-09-07 | Krystalka R. Womble | Method and System for Remote Monitoring, Care and Maintenance of Animals |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5651049A (en) * | 1994-08-30 | 1997-07-22 | Harris Corporation | RF connected message recording device and method for a telephone system |
IL127569A0 (en) * | 1998-09-16 | 1999-10-28 | Comsense Technologies Ltd | Interactive toys |
US6773344B1 (en) * | 2000-03-16 | 2004-08-10 | Creator Ltd. | Methods and apparatus for integration of interactive toys with interactive television and cellular communication systems |
US20100005700A1 (en) * | 2008-07-11 | 2010-01-14 | Kenneth Dale Thomas | Robotic fishing lure |
US9039482B2 (en) * | 2010-07-29 | 2015-05-26 | Dialware Inc. | Interactive toy apparatus and method of using same |
US10555511B2 (en) * | 2013-09-24 | 2020-02-11 | Gerald Vashina | Steerable fishing lure |
US9335764B2 (en) * | 2014-05-27 | 2016-05-10 | Recreational Drone Event Systems, Llc | Virtual and augmented reality cockpit and operational control systems |
Also Published As
Publication number | Publication date |
---|---|
US10051839B2 (en) | 2018-08-21 |
US20170196199A1 (en) | 2017-07-13 |
Similar Documents
Publication | Title
---|---
CN109998429B (en) | Mobile cleaning robot artificial intelligence for context awareness
US11669086B2 (en) | Mobile robot cleaning system
US11160432B2 (en) | System for spot cleaning by a mobile robot
CN109998421B (en) | Mobile cleaning robot assembly and durable mapping
US10824773B2 (en) | System and method of scanning an environment and generating two dimensional images of the environment
KR102608046B1 (en) | Guidance robot for airport and method thereof
JP7377837B2 (en) | Method and system for generating detailed environmental data sets through gameplay
JP5629390B2 (en) | Mobile robot system
KR102018763B1 (en) | Interfacing with a mobile telepresence robot
US10657691B2 (en) | System and method of automatic room segmentation for two-dimensional floorplan annotation
EP3548993A1 (en) | Virtual sensor configuration
US10397527B2 (en) | Remotely controlled robotic sensor ball
Ye et al. | 6-DOF pose estimation of a robotic navigation aid by tracking visual and geometric features
CN108106605A (en) | Depth transducer control based on context
CN112867424A (en) | Navigation and cleaning area dividing method and system, and moving and cleaning robot
GB2527207A (en) | Mobile human interface robot
KR20190134971A (en) | A plurality of autonomous mobile robots and a controlling method for the same
CN112204345A (en) | Indoor positioning method of mobile equipment, mobile equipment and control system
EP3527939A1 (en) | A system and method of on-site documentation enhancement through augmented reality
US11009887B2 (en) | Systems and methods for remote visual inspection of a closed space
US10051839B2 (en) | Animal exerciser system
KR20220101537A (en) | Cleaning robot and controlling method thereof
CIUFFREDA | Analisy and development of robot on board sensors network to localize people in indoor living environments
KR20240057297A (en) | Method and electronic device for training nueral network model
Ferent | Sensor Based Navigation for Mobile Robots
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
AS | Assignment |
Owner name: PETRONICS INC., ILLINOIS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:JUN, DAVID;FRIEDMAN, MICHAEL;COHEN, DAVID;REEL/FRAME:051055/0747 Effective date: 20180430 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |