
WO2020021873A1 - Processing device, processing method and program - Google Patents

Processing device, processing method and program

Info

Publication number: WO2020021873A1
Authority: WIPO (PCT)
Application number: PCT/JP2019/022386
Other languages: French (fr), Japanese (ja)
Prior art keywords: processing device, measurement target, measurement, subject, state
Inventors: 英佑 織戸, 克幸 永井, 宏紀 寺島, 重樹 泉, 祐輔 佐藤
Applicants: NEC Corporation (日本電気株式会社), Hosei University (学校法人法政大学)
Priority application: JP2020532197A (published as JPWO2020021873A1)
Publication: WO2020021873A1

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 - Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/20 - Analysis of motion

Definitions

  • the present invention relates to a processing device, a processing method, and a program.
  • Patent Documents 1 and 2 disclose techniques related to medical checks.
  • Patent Document 1 discloses that, when a subject performs an upper-limb exercise, a position measuring device specifies the position of the tip of the subject's hand, and a hand movable-range measuring means calculates the movable range of the hand based on the result of the specification.
  • Patent Document 2 discloses a device that stores information on the posture of a subject at the time when the subject feels pain or discomfort during a movement.
  • However, Patent Document 2 merely stores information on the posture of the subject when the subject feels pain or discomfort during the movement, and is not a technique for measuring the movable range of a predetermined part of the subject.
  • The object of the present invention is to make it possible to measure the range of motion of a plurality of body parts and in a plurality of movement directions without special knowledge.
  • According to the present invention, there is provided a processing device comprising: a selection unit that selects a measurement target of the range of motion; and a calculation unit that calculates a measurement value indicating the movable range of the selected measurement target by analyzing, with an analysis method corresponding to the selected measurement target, an image obtained by capturing the subject.
  • According to the present invention, there is also provided a processing method comprising: a selection step in which a computer selects a measurement target of the range of motion; and a calculation step in which the computer calculates a measurement value indicating the movable range of the selected measurement target by analyzing, with an analysis method corresponding to the selected measurement target, an image obtained by capturing the subject.
  • FIG. 5 is a diagram for explaining an example of a method of using the processing device 10 of the present embodiment.
  • FIG. 2 is a diagram illustrating an example of a hardware configuration of a processing apparatus 10 according to the embodiment.
  • FIG. 2 is a diagram illustrating an example of a functional block diagram of a processing device 10 according to the embodiment.
  • FIG. 4 is a diagram illustrating an example of a screen output by the processing device 10 of the embodiment.
  • FIG. 4 is a diagram illustrating an example of a screen output by the processing device 10 of the embodiment.
  • FIG. 5 is a flowchart illustrating an example of a processing flow of the processing device 10 of the present embodiment.
  • FIG. 4 is a diagram schematically illustrating an example of information stored in a processing device 10 according to the embodiment.
  • FIG. 5 is a flowchart illustrating an example of a processing flow of the processing device 10 of the present embodiment.
  • FIG. 2 is a diagram illustrating an example of a functional block diagram of a processing device 10 according to the embodiment.
  • FIG. 5 is a flowchart illustrating an example of a processing flow of the processing device 10 of the present embodiment.
  • FIG. 2 is a diagram illustrating an example of a functional block diagram of a processing device 10 according to the embodiment.
  • FIG. 4 is a diagram illustrating an example of a screen output by the processing device 10 of the embodiment.
  • FIG. 4 is a diagram illustrating an example of a screen output by the processing device 10 of the embodiment.
  • FIG. 4 is a diagram illustrating an example of a screen output by the processing device 10 of the embodiment.
  • FIG. 4 is a diagram illustrating an example of a screen output by the processing device 10 of the embodiment.
  • FIG. 4 is a diagram illustrating an example of a screen output by the processing device 10 of the embodiment.
  • FIG. 2 is a diagram illustrating an example of a functional block diagram of a processing device 10 according to the embodiment.
  • FIG. 4 is a diagram schematically illustrating an example of information stored in a processing device 10 according to the embodiment.
  • FIG. 9 is a diagram for describing an example of a prohibition operation.
  • FIG. 9 is a diagram for describing an example of a prohibition operation.
  • FIG. 9 is a diagram for describing an example of a prohibition operation.
  • FIG. 9 is a diagram for describing an example of a prohibition operation.
  • FIG. 9 is a diagram for describing an example of a prohibition operation.
  • FIG. 9 is a diagram for describing an example of a prohibition operation.
  • FIG. 9 is a diagram for describing an example of a prohibition operation.
  • the three-dimensional camera 200 can acquire distance information in addition to color information for each pixel.
  • The three-dimensional camera 200 is, for example, a Kinect.
  • the processing device can measure the movable range of a plurality of measurement targets.
  • The plurality of measurement targets differ from each other in at least one of the body part to be evaluated and the movement direction of that part.
  • An example of the plurality of measurement objects will be described in detail in the following embodiments.
  • A method for analyzing the image data is prepared in advance for each measurement target, and an algorithm implementing each method is stored in the processing device.
  • The algorithms corresponding to the plurality of measurement targets are common in that each specifies predetermined parts of the body (e.g., elbow, shoulder) and calculates a value indicating the range of motion based on the specified parts; however, the parts to be specified and the values calculated from them (e.g., angles, distances) differ from one measurement target to another.
  • An example of the algorithm will be described in the following embodiment.
  • When the processing device selects the measurement target of the movable range based on a user input, it analyzes the input image data with the analysis method corresponding to the selected measurement target and calculates a measurement value indicating the movable range of the selected measurement target.
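As a concrete illustration of this dispatch, the sketch below maps each selectable measurement target to its own analysis routine. All names and the frame format are assumptions for illustration, since the patent describes the behavior rather than an implementation.

```python
# Minimal sketch of per-target analysis dispatch. Function names, target
# names, and the frame format (dicts of precomputed per-frame values)
# are hypothetical.

def analyze_shoulder_flexion(frames):
    # Angle-based analysis (detailed in the second embodiment).
    return max(f["shoulder_elbow_angle"] for f in frames)

def analyze_lower_limb_reach(frames):
    # Distance-based analysis.
    return max(f["toe_distance_x"] for f in frames)

# One analysis algorithm is stored in advance per measurement target.
ANALYZERS = {
    "shoulder flexion right": analyze_shoulder_flexion,
    "lower limb reach side": analyze_lower_limb_reach,
}

def calculate_measurement(selected_target, frames):
    """Analyze the image data with the method matching the selected target."""
    return ANALYZERS[selected_target](frames)
```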
  • The functional units of the processing apparatus of the present embodiment are realized by an arbitrary combination of hardware and software, centered on a CPU (Central Processing Unit) of an arbitrary computer, a memory, a program loaded into the memory, and a storage unit such as a hard disk that stores the program (which can store not only programs stored in advance at the stage of shipping the apparatus, but also programs downloaded from storage media such as CDs (Compact Discs) or from servers on the Internet).
  • It will be understood by those skilled in the art that there are various modifications of the methods and apparatuses for realizing these functional units.
  • FIG. 2 is a block diagram illustrating a hardware configuration of the processing apparatus according to the present embodiment.
  • the processing device has a processor 1A, a memory 2A, an input / output interface 3A, a peripheral circuit 4A, and a bus 5A.
  • the peripheral circuit 4A includes various modules.
  • the processing device may not have the peripheral circuit 4A.
  • the processing device may be constituted by a plurality of physically separated devices. In this case, each device can have the above hardware configuration.
  • the bus 5A is a data transmission path through which the processor 1A, the memory 2A, the peripheral circuit 4A, and the input / output interface 3A mutually transmit and receive data.
  • the processor 1A is an arithmetic processing device such as a CPU and a GPU (Graphics Processing Unit).
  • the memory 2A is a memory such as a RAM (Random Access Memory) or a ROM (Read Only Memory).
  • The input/output interface 3A includes an interface for acquiring information from an input device, an external device, an external server, an external sensor (the three-dimensional camera), and the like, and an interface for outputting information to an output device, an external device, an external server, and the like.
  • the input device is, for example, a keyboard, a mouse, a microphone, and the like.
  • the output device is, for example, a display, a speaker, a printer, a mailer, or the like.
  • The processor 1A can issue commands to each module and perform arithmetic operations based on the results of those operations.
  • FIG. 3 shows an example of a functional block diagram of the processing device 10.
  • the processing device 10 includes a selection unit 11 and a calculation unit 12.
  • The selection unit 11 selects a measurement target of the movable range based on a user input. For example, the selection unit 11 outputs a UI (user interface) screen for selecting a measurement target to a display and receives the user input via the UI screen.
  • the selection unit 11 may output a UI screen from which “all item measurement” and “individual measurement” can be selected.
  • When "all item measurement" is designated by a user input, the selection unit 11 selects all the measurement targets one by one in a predetermined order.
  • When "individual measurement" is designated by a user input, the selection unit 11 outputs, on a display, a UI screen that lists a plurality of measurement targets in a selectable manner, as shown in FIG. 5. The selection unit 11 then selects the measurement target designated by the user input on the UI screen. Details of each of the plurality of measurement targets shown in FIG. 5 will be described in the following embodiments.
  • the selection unit 11 may output a UI screen designating an operation to a display, and accept a user input from the UI screen.
  • the operation that can be specified on the UI screen may be an operation performed during a predetermined sport. Examples include, but are not limited to, a baseball throwing action, a baseball hitting action, a tennis serve action, a tennis forehand action, a tennis volley action, and the like.
  • the operation that can be specified on the UI screen may be an operation performed in daily life. For example, an operation of ascending a stair, an operation of lifting an object, and the like are exemplified, but are not limited thereto.
  • For this purpose, the selection unit 11 stores in advance information associating each operation with the measurement targets related to that operation. The selection unit 11 then selects the measurement targets corresponding to the operation designated by the user input, one by one in a predetermined order, as sketched below.
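Such an association might be held as a simple table; a sketch with hypothetical operation and target names:

```python
# Hypothetical mapping from a designated operation to its related
# measurement targets; the concrete pairings are illustrative only.
OPERATION_TO_TARGETS = {
    "baseball throwing": [
        "shoulder flexion right",
        "shoulder external rotation right",
        "torso rotation left",
    ],
    "stair ascending": ["hip flexion right", "hip flexion left"],
}

def select_targets(operation):
    """Yield the measurement targets for the designated operation,
    one by one in a predetermined order."""
    yield from OPERATION_TO_TARGETS[operation]
```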
  • the calculation unit 12 calculates a measurement value indicating the movable range of the selected measurement target by analyzing the image of the subject by an analysis method corresponding to the selected measurement target.
  • Here, the "image" is an image of the subject 100 performing a predetermined movement, captured by the three-dimensional camera 200.
  • the processing apparatus 10 stores in advance an algorithm indicating a method of analyzing image data corresponding to each of a plurality of measurement targets.
  • the algorithm corresponding to each of the plurality of measurement targets specifies a predetermined part of the body (eg, elbow, shoulder), and is common in that a value indicating the range of motion is calculated based on the specified part.
  • the part to be specified and the value (eg, angle, distance, etc.) calculated based on the specified part are different from each other.
  • the calculation unit 12 analyzes the image using an algorithm corresponding to the measurement target selected by the selection unit 11 to calculate a measurement value indicating the movable range of the selected measurement target. Details of the algorithm will be described in the following embodiments.
  • the selection unit 11 selects a measurement object of the movable range based on a user input (S10).
  • Next, the calculation unit 12 calculates the measurement value indicating the movable range of the selected measurement target by analyzing the image obtained by photographing the subject with the analysis method corresponding to the measurement target selected in S10 (S20).
  • As described above, the processing device 10 of the present embodiment prepares an image-data analysis method (algorithm) for each of a plurality of measurement targets that differ from each other in at least one of the evaluated body part and the movement direction of that part, analyzes an image obtained by photographing the subject with the analysis method corresponding to the selected measurement target, and calculates a measurement value indicating the movable range of that target. According to such a processing device 10, it is possible to evaluate the movable ranges of a plurality of body parts and a plurality of movement directions without special knowledge.
  • In addition, since the movable range is calculated by the computer analyzing the image based on the predetermined algorithm, high reproducibility of the measurement is obtained.
  • Further, when an operation (e.g., a baseball throwing action, a baseball hitting action, a tennis serve action, a tennis forehand action, a tennis volley action) is designated by a user input, the measurement targets related to the designated operation are selected. That is, the user can have the appropriate measurement targets selected simply by designating the operation relevant to him or her, so that even less specialized knowledge is required.
  • The processing apparatus 10 of the present embodiment differs from the first embodiment in that the method of calculating the measurement value indicating the movable range is described more concretely. The details will be described below.
  • An example of a functional block diagram of the processing device 10 is shown in FIG. 3, as in the first embodiment. As illustrated, the processing device 10 includes a selection unit 11 and a calculation unit 12. The configuration of the selection unit 11 is the same as in the first embodiment.
  • The calculation unit 12 calculates the measurement value indicating the movable range of the selected measurement target by analyzing the image of the subject with the analysis method corresponding to the measurement target selected by the selection unit 11.
  • the calculation unit 12 specifies the subject on the image.
  • Means for identifying a person present in the image is not particularly limited, and can be realized using any technique.
  • the calculation unit 12 specifies the location of the body related to the selected measurement target on the image.
  • The locations to be specified include the head, the neck, the left and right hands, the left and right thumbs, the left and right wrists, the left and right elbows, the left and right shoulders, the midpoint between the shoulders, the left and right hip joints, the midpoint between the hip joints, the center of the pelvis, the center of the spine, the left and right knees, the left and right ankles, and the left and right toes, but are not limited thereto.
  • These locations can be identified by image analysis based on the appearance characteristics of the entire body, the appearance characteristics of each location, and the like.
  • the calculation unit 12 calculates a measurement value based on the specified body part.
  • the calculation unit 12 may calculate an angle between a line connecting the first point and the second point of the body and a reference line as a measurement value indicating a movable range.
  • the calculation unit 12 may calculate the distance between the third location and the fourth location of the body as a measurement value indicating the range of motion.
  • Alternatively, the calculation unit 12 may calculate, as a measurement value indicating the movable range, the distance between a fifth location of the body when the subject is in a first posture and the position of the same fifth location when the subject is in a second posture different from the first posture. The sketch below illustrates these three kinds of calculation.
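These three kinds of measurement value reduce to simple vector geometry on the identified joint coordinates. A sketch, assuming each location is given as a 3-D NumPy vector (x, y, z):

```python
import numpy as np

def angle_to_reference(p1, p2, ref_vec):
    """Angle (degrees) between the line from p1 to p2 and a reference
    direction ref_vec."""
    v = p2 - p1
    cos = np.dot(v, ref_vec) / (np.linalg.norm(v) * np.linalg.norm(ref_vec))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

def location_distance(p3, p4):
    """Distance between the third and fourth identified locations."""
    return float(np.linalg.norm(p3 - p4))

def posture_displacement(p5_first_posture, p5_second_posture):
    """Distance moved by the same (fifth) location between two postures."""
    return float(np.linalg.norm(p5_second_posture - p5_first_posture))
```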
  • information indicating the location of the body specified for each measurement target may be registered in the processing device 10 in advance. Then, the calculation unit 12 may grasp the location of the body to be specified based on the information and the measurement target selected by the selection unit 11. Further, the calculation unit 12 may grasp the value to be calculated based on the information shown in FIG.
  • each measurement target will be described in detail, and an example of a body part specified by each measurement target and an example of an image analysis method will be described.
  • The image analysis method is, for example, as follows. First, the calculation unit 12 specifies, by image analysis, the shoulder and elbow on the side being moved (the positions indicated by the dots in FIGS. 8 and 9). Next, the calculation unit 12 determines a line dropped from the shoulder in the Y-axis direction as a reference line (the line indicated by S in FIG. 8) and a line connecting the shoulder and the elbow as a movement line. Note that the X-, Y-, and Z-axis directions are as shown in FIGS. 1 and 8: the direction connecting the three-dimensional camera 200 and the subject 100 is the Z-axis direction, the vertical direction is the Y-axis direction, and the horizontal direction is the X-axis direction.
  • the calculation unit 12 calculates an angle formed between a line obtained by projecting the reference line on the YZ plane and a line obtained by projecting the movement line on the YZ plane as a measured value indicating the movable range of the flexion of the shoulder joint.
  • the calculation unit 12 can analyze, for example, a moving image showing a state in which the subject 100 performs the above-described series of operations. In this case, the calculation unit 12 can analyze the image for each frame and calculate the angle formed for each frame. Then, the calculation unit 12 can calculate the maximum value among the plurality of angles calculated for each frame as a measurement value indicating the movable range of the flexion of the shoulder joint.
  • The image analysis method described here is an example, and other image analysis methods that achieve the same measurement may be adopted.
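A sketch of the projection-and-maximum procedure for shoulder flexion under the axis convention above (camera-to-subject Z, vertical Y); the per-frame joint format is an assumption:

```python
import numpy as np

def project_yz(v):
    # Keep the (Y, Z) components to project a 3-D vector onto the YZ plane.
    return np.array([v[1], v[2]])

def shoulder_flexion_angle(shoulder, elbow):
    """Angle between the movement line (shoulder to elbow) and the reference
    line dropped from the shoulder in the -Y direction, both projected onto
    the YZ plane; joints are assumed to be (x, y, z) NumPy vectors."""
    move = project_yz(elbow - shoulder)
    ref = np.array([-1.0, 0.0])  # the -Y direction in (Y, Z) coordinates
    cos = np.dot(move, ref) / (np.linalg.norm(move) * np.linalg.norm(ref))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

def shoulder_flexion_rom(frames):
    # frames: per-frame dicts of tracked joints from the 3-D camera.
    # The maximum per-frame angle is taken as the measured range of motion.
    return max(shoulder_flexion_angle(f["shoulder"], f["elbow"]) for f in frames)
```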
  • the image analysis method is, for example, as follows. First, the calculation unit 12 specifies an elbow and a wrist on the side to be operated by image analysis (positions indicated by dots in FIGS. 10 and 11). Next, the calculation unit 12 determines a line extending from the elbow in the Z-axis direction (the front side of the body) as a reference line (a line indicated by S in FIG. 10), and a line connecting the elbow and the wrist as a movement line. Determine.
  • the calculation unit 12 calculates an angle formed by a line obtained by projecting the reference line on the YZ plane and a line obtained by projecting the movement line on the YZ plane as a measurement value indicating the movable range of the external rotation of the shoulder joint.
  • the calculation unit 12 can analyze, for example, a moving image showing a state in which the subject 100 performs the above-described series of operations. In this case, the calculation unit 12 can analyze the image for each frame and calculate the angle formed for each frame. Then, the calculation unit 12 can calculate the maximum value among the plurality of angles formed for each frame as a measurement value indicating the movable range of the external rotation of the shoulder joint.
  • The image analysis method described here is an example, and other image analysis methods that achieve the same measurement may be adopted.
  • the image analysis method is, for example, as follows. First, the calculation unit 12 specifies the elbow and the wrist on the side to be operated by image analysis (positions indicated by dots in FIGS. 12 and 13). Next, the calculation unit 12 determines a line extending from the elbow in the Z-axis direction (rear side of the body) as a reference line (a line indicated by S in FIG. 12), and a line connecting the elbow and the wrist as a movement line. Determine.
  • the calculation unit 12 calculates an angle formed by a line obtained by projecting the reference line on the YZ plane and a line obtained by projecting the movement line on the YZ plane as a measured value indicating the movable range of the internal rotation of the shoulder joint.
  • the calculation unit 12 can analyze, for example, a moving image showing a state in which the subject 100 performs the above-described series of operations. In this case, the calculation unit 12 can analyze the image for each frame and calculate the angle formed for each frame. Then, the calculation unit 12 can calculate the maximum value among the plurality of angles formed for each frame as a measurement value indicating the movable range of the internal rotation of the shoulder joint.
  • The image analysis method described here is an example, and other image analysis methods that achieve the same measurement may be adopted.
  • Hip joint flexion ("hip flexion right" and "hip flexion left" in FIG. 5): As shown in FIGS. 14 and 15, in hip flexion the subject 100 faces the three-dimensional camera 200 (not shown) and raises one leg from a standing-at-attention posture to bring the knee closer to the chest.
  • In "hip flexion right" the subject 100 moves the right leg, and in "hip flexion left" the subject 100 moves the left leg.
  • the image analysis method is, for example, as follows. First, the calculation unit 12 specifies the hip joint and the knee on the side to be operated by image analysis (the positions indicated by the dots in FIGS. 14 and 15). Next, the calculation unit 12 determines a line lowered from the hip joint in the Y-axis direction as a reference line (a line indicated by S in FIG. 14), and determines a line connecting the hip joint and the knee as a movement line.
  • the calculation unit 12 calculates an angle between a line obtained by projecting the reference line on the YZ plane and a line obtained by projecting the movement line on the YZ plane as a measurement value indicating the movable range of the hip joint flexion.
  • the calculation unit 12 can analyze, for example, a moving image showing a state in which the subject 100 performs the above-described series of operations. In this case, the calculation unit 12 can analyze the image for each frame and calculate the angle formed for each frame. Then, the calculation unit 12 can calculate the maximum value of the plurality of angles formed for each frame as a measurement value indicating the movable range of hip joint flexion.
  • The image analysis method described here is an example, and other image analysis methods that achieve the same measurement may be adopted.
  • Torso rotation ("torso rotation right" and "torso rotation left" in FIG. 5): As shown in FIGS. 16 and 17, in torso rotation the subject 100 faces the three-dimensional camera 200 (not shown) and rotates the upper body in a sitting position with the hands clasped behind the head, shifting the weight in the direction of rotation. In "torso rotation right" the subject 100 rotates the upper body to the right, and in "torso rotation left" the subject 100 rotates the upper body to the left.
  • the image analysis method is, for example, as follows. First, the calculation unit 12 specifies both hip joints and both shoulders by image analysis (positions indicated by dots in FIGS. 16 and 17). Next, the calculation unit 12 determines a line connecting both hip joints as a reference line (a line indicated by S in FIG. 16) and a line connecting both shoulders as a movement line.
  • Then, the calculation unit 12 calculates the angle between a line obtained by projecting the reference line on the XZ plane and a line obtained by projecting the movement line on the XZ plane as a measurement value indicating the movable range of the torso rotation.
  • the calculation unit 12 can analyze, for example, a moving image showing a state in which the subject 100 performs the above-described series of operations. In this case, the calculation unit 12 can analyze the image for each frame and calculate the angle formed for each frame. Then, the calculation unit 12 can calculate the maximum value among the plurality of angles formed for each frame as a measurement value indicating the movable range of the torso rotation.
  • The image analysis method described here is an example, and other image analysis methods that achieve the same measurement may be adopted.
  • the image analysis method is, for example, as follows. First, the calculation unit 12 specifies left and right toes by image analysis (positions indicated by dots in FIGS. 18 and 19). Then, the calculation unit 12 calculates the distance between the left and right toes in the X-axis direction as a measurement value indicating the movable range of the lower limb reach side.
  • the calculation unit 12 can analyze, for example, a moving image showing a state in which the subject 100 performs the above-described series of operations. In this case, the calculation unit 12 can analyze the image for each frame and calculate the distance for each frame. Then, the calculation unit 12 can calculate the maximum value among the plurality of distances calculated for each frame as a measurement value indicating the movable range of the lower limb reach side.
  • The image analysis method described here is an example, and other image analysis methods that achieve the same measurement may be adopted.
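For the reach items, the measurement value is a coordinate distance rather than a projected angle; a sketch under the same assumed frame format:

```python
def lower_limb_reach_rom(frames):
    """Maximum X-axis distance between the left and right toes over all
    frames, taken as the measurement value for lower limb reach; each
    frame is assumed to hold (x, y, z) positions per joint."""
    return max(abs(f["toe_left"][0] - f["toe_right"][0]) for f in frames)
```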
  • the image analysis method is, for example, as follows. First, the calculation unit 12 specifies left and right toes by image analysis (positions indicated by dots in FIGS. 20 and 21). Then, the calculation unit 12 calculates the distance between the left and right toes in the X-axis direction as a measurement value indicating the movable range behind the lower limb reach.
  • the calculation unit 12 can analyze, for example, a moving image showing a state in which the subject 100 performs the above-described series of operations. In this case, the calculation unit 12 can analyze the image for each frame and calculate the distance for each frame. Then, the calculation unit 12 can calculate the maximum value among the plurality of distances calculated for each frame as a measurement value indicating the range of motion behind the lower limb reach.
  • The image analysis method described here is an example, and other image analysis methods that achieve the same measurement may be adopted.
  • the image analysis method is, for example, as follows. First, the calculation unit 12 specifies a hand in the first posture and a hand in the second posture by image analysis (positions indicated by dots in FIGS. 22 and 23). Then, the calculation unit 12 calculates a distance in the X-axis direction between the hand in the first posture and the hand in the second posture as a measurement value indicating the movable range of the upper limb reach.
  • the calculation unit 12 can analyze, for example, a moving image showing a series of operations from the first posture to the second posture of the subject 100. In this case, the calculation unit 12 can analyze the image for each frame and calculate the distance for each frame. Then, the calculation unit 12 can calculate the maximum value among the plurality of distances calculated for each frame as a measurement value indicating the movable range of the upper limb reach.
  • The image analysis method described here is an example, and other image analysis methods that achieve the same measurement may be adopted.
  • T-shaped balance ("T-shaped balance left support" and "T-shaped balance right support" in FIG. 5): In "T-shaped balance left support" the subject 100 stands on the left foot, and in "T-shaped balance right support" the subject 100 stands on the right foot.
  • The image analysis method is, for example, as follows. First, the calculation unit 12 specifies the shoulder, elbow, hip joint, and knee by image analysis (the positions indicated by the dots in FIGS. 24 and 25). Next, the calculation unit 12 determines a line extending from the shoulder in the X-axis direction (the direction opposite to the raised arm) as a reference line (the line indicated by S in FIG. 24), and a line connecting the shoulder and the elbow as the movement line whose angle with this reference line is calculated. Similarly, the calculation unit 12 determines a line extending from the hip joint in the X-axis direction (the direction opposite to the raised foot) as a reference line (the line indicated by S in FIG. 24), and a line connecting the hip joint and the knee as the movement line whose angle with this reference line is calculated.
  • the calculation unit 12 calculates an angle between a line obtained by projecting the reference line on the XY plane and a line obtained by projecting the movement line on the XY plane as a measurement value indicating the movable range of the T-shaped balance.
  • the calculation unit 12 can analyze, for example, a moving image showing a state in which the subject 100 performs the above-described series of operations. In this case, the calculation unit 12 can analyze the image for each frame and calculate the angle formed for each frame. Then, the calculation unit 12 can calculate the maximum value among the plurality of angles formed for each frame as a measurement value indicating the movable range of the T-shaped balance.
  • The image analysis method described here is an example, and other image analysis methods that achieve the same measurement may be adopted.
  • the calculation unit 12 specifies the subject on the image (S21). Next, the calculation unit 12 specifies the location of the body related to the measurement target selected in S10 on the image (S22). Next, the calculation unit 12 calculates a measured value based on the location specified in S22 (S23).
  • As described above, the measurement value can be calculated based on the specified locations: for example, the angle between a line connecting two locations and a reference line, the distance between two locations, or the distance between the position of a location when the subject is in a first posture and the position of the same location when the subject is in a second posture. According to such a processing device 10, the movable ranges of various body parts in various movement directions can be measured.
  • the processing apparatus 10 of the present embodiment is different from the first and second embodiments in that the processing apparatus 10 has a function of evaluating a state of a measurement target based on a measurement value indicating a movable range. The details will be described below.
  • the processing device 10 includes a selection unit 11, a calculation unit 12, and an evaluation unit 13.
  • the configurations of the selection unit 11 and the calculation unit 12 are the same as those in the first and second embodiments.
  • the evaluation unit 13 evaluates the state of the measurement target based on the measurement value calculated by the calculation unit 12 and indicating the range of motion of the measurement target.
  • the evaluation unit 13 can execute at least one of the first to third evaluation methods described below.
  • In the first evaluation method, when measurement values of two patterns corresponding to the left and right of the body are calculated, the evaluation unit 13 calculates the difference between the measurement values of the two patterns and evaluates the state of the measurement target based on the difference. The smaller the difference, the more appropriate the left/right balance; the larger the difference, the more the left/right balance is lost. According to the first evaluation method, it is possible to evaluate whether the left/right balance of the body is in an appropriate state.
  • "Calculating two patterns of measurement values corresponding to the left and right of the body" means, for example: calculating the measurement values of shoulder flexion right and shoulder flexion left; of shoulder external rotation right and shoulder external rotation left; of shoulder internal rotation right and shoulder internal rotation left; of hip flexion right and hip flexion left; of torso rotation right and torso rotation left; of lower limb reach side standing on the left foot and extending the right foot, and standing on the right foot and extending the left foot; of lower limb reach behind standing on the left foot and extending the right foot, and standing on the right foot and extending the left foot; of upper limb reach with the right foot as the measuring foot and with the left foot as the measuring foot; and of T-shaped balance left support and T-shaped balance right support.
  • the evaluation unit 13 may classify the difference between the measured values of the two patterns into a plurality of levels, and hold first state definition information defining the state of the measurement target for each level.
  • the first state definition information includes, for example, “good” for the difference D0 to D1, “slightly dangerous” for the difference D1 to D2, and “dangerous” for the difference D2 to D3. Then, the evaluation unit 13 may evaluate the state of each measurement target based on the first state definition information and the difference between the measured values of the two patterns.
  • the first state definition information may be defined for each measurement target.
  • the numerical range of the difference in each state such as “good”, “slightly dangerous”, and “dangerous” may be different for each measurement target.
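A sketch of this threshold-based classification, with hypothetical boundary values standing in for D1 and D2 (the patent leaves the concrete ranges to the implementation):

```python
# Hypothetical first state definition information: per-target boundaries
# (D1, D2) dividing the left/right difference into levels. The numbers
# are illustrative placeholders, not values from the patent.
FIRST_STATE_DEFINITION = {
    "shoulder flexion": (10.0, 20.0),       # degrees
    "lower limb reach side": (0.05, 0.10),  # metres
}

def evaluate_left_right_balance(target, value_right, value_left):
    """Classify the left/right difference for one measurement target."""
    d1, d2 = FIRST_STATE_DEFINITION[target]
    diff = abs(value_right - value_left)
    if diff < d1:
        return "good"
    if diff < d2:
        return "slightly dangerous"
    return "dangerous"
```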
  • the evaluation unit 13 can evaluate the state of the measurement target based on the comparison result between the measured value and the reference value.
  • the evaluation unit 13 may classify the measured values into a plurality of levels, and hold second state definition information that defines the state of the measurement target for each level.
  • the second state definition information includes, for example, “good” for measured values C0 to C1, “slightly dangerous” for measured values C1 to C2, and “dangerous” for measured values C2 to C3. Then, the evaluation unit 13 may evaluate the state of each measurement target based on the second state definition information and the measurement value.
  • the second state definition information may be defined for each measurement target. That is, the numerical value ranges of the measured values in each state such as “good”, “slightly dangerous”, and “dangerous” may be different for each measurement target.
  • the second state definition information may be defined for each attribute of the subject (for example, for each age and each sex). That is, the numerical value ranges of the measured values in each of the states such as “good”, “slightly dangerous”, and “dangerous” may be different for each attribute of the subject (eg, for each age, each gender).
  • In this case, the processing device 10 receives an input of a value specifying an attribute of the subject by any means, and the evaluation unit 13 specifies the attribute of the subject based on the input value and evaluates the state of the measurement target using the second state definition information corresponding to the specified attribute.
  • the evaluation unit 13 evaluates the state of the measurement target based on the measurement value (eg, the latest measurement value) of the subject and the past measurement value. According to the third evaluation method, it is possible to evaluate a change (deterioration, maintenance of the current state, improvement, etc.) of the state of the measurement target of the subject.
  • the evaluation unit 13 may hold third state definition information that defines the state of the measurement target based on past measurement values.
  • the third state definition information includes, for example, when the measured value is “past measured value minus ⁇ or more and past measured value plus ⁇ or less”, the current state is “maintained”, and the measured value is “past measured value minus less than ⁇ ”. In this case, the result is “deteriorated”, and when the measured value is “greater than the past measured value plus ⁇ ”, it is “improved”. Then, the evaluation unit 13 may evaluate the state of each measurement target based on the third state definition information, the past measurement value, and, for example, the latest measurement value.
  • the third state definition information may be defined for each measurement target. That is, the value of ⁇ may be different for each measurement target. Further, the third state definition information may be defined for each attribute of the subject (for example, for each age and each gender). That is, the value of ⁇ may be different for each attribute of the subject (eg, for each age, each gender).
  • In this case, the processing device 10 receives an input of a value specifying an attribute of the subject by any means, and the evaluation unit 13 specifies the attribute of the subject based on the input value and evaluates the state of the measurement target using the third state definition information corresponding to the specified attribute.
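A sketch of the third evaluation method; the per-target α value and the target name are illustrative assumptions:

```python
# Hypothetical third state definition information: alpha per measurement
# target (it could further vary per subject attribute such as age or sex).
ALPHA = {"shoulder flexion": 5.0}  # degrees; illustrative value

def evaluate_change(target, latest_value, past_value):
    """Compare the latest measurement with a past one using alpha."""
    a = ALPHA[target]
    if latest_value < past_value - a:
        return "deteriorated"
    if latest_value > past_value + a:
        return "improved"
    return "maintained"  # within past value +/- alpha
```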
  • the selection unit 11 selects a measurement object of the movable range based on a user input (S10).
  • the calculation unit 12 calculates the measurement value indicating the movable range of the selected measurement target by analyzing the image obtained by photographing the subject using an analysis method corresponding to the measurement target selected in S10 (S20).
  • the evaluation unit 13 evaluates the state of the measurement target based on the measurement value calculated in S20 (S30).
  • the state of the measurement target can be evaluated based on the measurement values.
  • the state of the measurement target can be evaluated based on the difference between the two patterns of measured values. In this case, it is possible to evaluate whether the right and left balance of the body is in an appropriate state.
  • the state of the measurement target can be evaluated based on the comparison result between the measurement value and the reference value. In this case, it is possible to evaluate whether the state of the measurement target is a general standard state.
  • the state of the measurement target can be evaluated based on the comparison result between the measurement value of the subject and the past measurement value. In this case, the change (deterioration, maintenance of the current state, improvement, etc.) of the state of the measurement target of the subject can be evaluated.
  • the state of the measurement target can be evaluated from various angles.
  • the processing apparatus 10 of the present embodiment is different from the first to third embodiments in that the processing apparatus 10 has a function of outputting information indicating a state of a measurement target. The details will be described below.
  • the processing device 10 includes a selection unit 11, a calculation unit 12, an evaluation unit 13, and an output unit 14.
  • the configurations of the selection unit 11, the calculation unit 12, and the evaluation unit 13 are the same as those of the first to third embodiments.
  • the output unit 14 outputs various information via the output device.
  • the output device includes, but is not limited to, a display, a projection device, a mailer, a printer, a speaker, a warning lamp, and the like.
  • The output unit 14 outputs, for example, information indicating the state of the measurement target evaluated by the evaluation unit 13. In addition, when the state of the measurement target is a predetermined state, the output unit 14 can output, together with the information indicating the state of the measurement target, information for improving the state of the measurement target or information for acquiring that information.
  • The "predetermined state" here is a state that requires improvement.
  • the predetermined state is, for example, “danger”, “slightly dangerous”, “deteriorated”, or the like.
  • "Information for improving the state of the measurement target" indicates, for example, exercises for improving the state.
  • the information may be a moving image in which a person or the like is performing the exercise, or may be a description of the exercise using photographs, illustrations, characters, and the like.
  • “Information for acquiring the information” is a storage location (eg, URL) of a data file containing the information, a guide for accessing the information, and the like.
  • the output unit 14 may output a measurement value indicating the movable range of the measurement target calculated by the calculation unit 12.
  • FIG. 30 shows an example of information output from the output unit 14.
  • Here, the output unit 14 outputs, for each measurement target, information in which the measurement values (both left and right) indicating the movable range of the measurement target, the difference between the left and right measurement values, and the state evaluated by the evaluation unit 13 are associated with one another.
  • the state evaluated by the evaluation unit 13 is indicated by a facial icon expression.
  • In association with each measurement target whose state evaluated by the evaluation unit 13 is the predetermined state (a state requiring improvement), the output unit 14 outputs icons linked to moving images (the "Do It!" and "Check" icons).
  • FIG. 31 shows another example of the information output by the output unit 14.
  • the output unit 14 outputs, for each measurement object, the state evaluated by the evaluation unit 13, the difference between the left and right measurement values, and a bar graph indicating the difference in association with each other.
  • the state evaluated by the evaluation unit 13 is indicated by a facial icon expression.
  • the difference between the left and right measurements is the number below the face icon.
  • In association with each measurement target whose state evaluated by the evaluation unit 13 is the predetermined state (a state requiring improvement), the output unit 14 outputs icons linked to moving images (the "Do It!" and "Check" icons).
  • FIG. 32 shows another example of the information output by the output unit 14.
  • the evaluation unit 13 evaluates the state of the measurement target in four stages from 0 to 3. 3 is the best state, and the state becomes worse as the numerical value becomes smaller.
  • the output unit 14 outputs a radar chart showing the state of each measurement target and the entire state.
  • FIG. 33 shows another example of the information output by the output unit 14.
  • the output unit 14 outputs information indicating a time-series change of a measurement value of a certain measurement object of a certain subject 100.
  • the output unit 14 may output a moving image of the subject 100 in real time while the subject 100 is performing a predetermined operation for measuring the range of motion. Then, the output unit 14 may output various information on the screen.
  • For example, a moving image of the subject 100 is displayed in real time in area (1) of the screen.
  • Information indicating the measurement target selected by the selection unit 11 is also displayed.
  • In area (3), the maximum value among the measurement values calculated from each frame during the measurement is displayed.
  • While the measurement is in progress, the calculation unit 12 analyzes the image and the output unit 14 outputs the analysis result (measurement value) in real time.
  • In addition, a button for re-measurement and a button (marked with a star) for forcibly updating the maximum value displayed in area (3) with the measurement value displayed at that time are displayed.
  • various information such as a measured value indicating the movable range of the measurement target, a difference between left and right measurement values, and a state of the measurement target can be output for each measurement target.
  • the subject 100 can easily grasp the state of the body part of the subject by browsing the information.
  • With the processing device 10 of the present embodiment, when there is a measurement target whose state is not good, information for improving the state of the measurement target or information for acquiring that information can be output.
  • the processing device 10 can provide information desired by the subject 100 at an appropriate timing.
  • the processing device 10 of the present embodiment has a function of detecting that the subject has performed a prohibition operation that affects the accuracy of the measurement while performing a predetermined operation for measuring the movable range of the measurement target. This is different from the first to fourth embodiments. The details will be described below.
  • the processing device 10 includes a selection unit 11, a calculation unit 12, an evaluation unit 13, an output unit 14, and a prohibited operation detection unit 15.
  • the configurations of the selection unit 11, the calculation unit 12, the evaluation unit 13, and the output unit 14 are the same as those of the first to fourth embodiments.
  • the prohibited operation detection unit 15 analyzes an image of the subject 100 and detects that the subject has performed a prohibited operation.
  • the prohibited operation detecting unit 15 specifies a plurality of locations of the subject's body related to the measurement target selected by the selecting unit 11 on the image, and detects the prohibited operation based on the specified plurality of locations. For example, the prohibition operation detection unit 15 may determine that the prohibition operation has been performed when the angle between the line connecting the first and second parts of the body and the reference line deviates from the reference range. In addition, the prohibited operation detection unit 15 may determine that the prohibited operation has been performed when the distance between the third location and the fourth location of the body deviates from the reference range.
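A sketch of such a deviation test, assuming per-frame joint positions as 3-D NumPy vectors and an illustrative reference range:

```python
import numpy as np

def tilt_from_y_in_yz(p_lower, p_upper):
    """Angle (degrees) between the line from p_lower to p_upper, projected
    onto the YZ plane, and the Y axis."""
    d = p_upper - p_lower
    v = np.array([d[1], d[2]])      # keep the (Y, Z) components
    cos = v[0] / np.linalg.norm(v)  # dot product with the unit Y axis (1, 0)
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

def trunk_tilt_prohibited(frame, max_tilt_deg=15.0):
    """Flag a frame in which the trunk line (shoulder midpoint up to the
    head) tilts from the Y axis beyond a reference range; the 15-degree
    limit is an illustrative placeholder, not a value from the patent."""
    shoulder_mid = (frame["shoulder_left"] + frame["shoulder_right"]) / 2.0
    return tilt_from_y_in_yz(shoulder_mid, frame["head"]) > max_tilt_deg
```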
  • information indicating the location of the body specified for each measurement target may be registered in the processing device 10 in advance. Then, the prohibited motion detection unit 15 may grasp the location of the body to be specified based on the information and the measurement target selected by the selection unit 11. Further, the prohibited operation detection unit 15 may grasp the content to be detected based on the information shown in FIG.
  • the calculation unit 12 also specifies a plurality of locations of the subject's body related to the measurement target selected by the selection unit 11 on the image, and performs a predetermined process based on the identified plurality of locations. I do.
  • Note that the body parts specified by the calculation unit 12 and the body parts specified by the prohibited operation detection unit 15 may differ from each other (i.e., they need not completely match). That is, the plurality of body parts specified by the calculation unit 12 in relation to a first measurement target may differ from the plurality of body parts specified by the prohibited operation detection unit 15 in relation to the same first measurement target.
  • For example, the prohibited operation detection unit 15 may detect the prohibited operation of (1) by detecting that a line connecting the head and the midpoint of the left and right shoulders, a line connecting the midpoint of the left and right shoulders and the center of the pelvis, or the like is inclined from the Y axis by a predetermined angle or more in the YZ plane.
  • The prohibited operation detection unit 15 may detect the prohibited operation of (2) by detecting that the line connecting both shoulders is inclined from the X axis by a predetermined angle or more in the XZ plane.
  • The prohibited operation detection unit 15 may detect the prohibited operation of (3) by detecting that the line connecting the shoulder and the elbow of the moving arm is inclined from the Y axis by a predetermined angle or more in the XY plane.
  • For example, the prohibited operation detection unit 15 may detect the prohibited operation of (1) by detecting that a line connecting the head and the midpoint of the left and right shoulders, a line connecting the midpoint of the left and right shoulders and the center of the pelvis, or the like is inclined from the Y axis by a predetermined angle or more in the YZ plane.
  • The prohibited operation detection unit 15 may detect the prohibited operation of (2) by detecting that the line connecting both shoulders is inclined from the X axis by a predetermined angle or more in the XZ plane.
  • The prohibited operation detection unit 15 may detect the prohibited operation of (3) by detecting that the line connecting the elbow and the hand of the moving arm is inclined from the Y axis by a predetermined angle or more in the XY plane.
  • The prohibited operation detection unit 15 may detect the prohibited operation of (4) by projecting the line connecting the shoulder and the elbow of the moving arm and the line connecting the elbow and the hand onto the XY plane, and detecting that the angle θ between the projected lines deviates from the range of 90° minus β to 90° plus β.
  • For example, the prohibited operation detection unit 15 may detect the prohibited operation of (1) by detecting that a line connecting the head and the midpoint of the left and right shoulders, a line connecting the midpoint of the left and right shoulders and the midpoint of the left and right hip joints, or the like is inclined from the Y axis by a predetermined angle or more in the YZ plane. The prohibited operation detection unit 15 may also detect a prohibited operation by detecting that the line connecting both shoulders is inclined from the X axis by a predetermined angle or more in the XY plane.
  • For example, the prohibited operation detection unit 15 may detect the prohibited operation of (1) by calculating the distance L between both toes and the distance M in the X-axis direction between the center of the pelvis and the toe on the supporting side, and then detecting that M and L/3 satisfy a predetermined magnitude relation.
  • For example, the prohibited operation detection unit 15 may detect the prohibited operation of (1) by calculating the distance L between both toes and the distance M in the X-axis direction between the center of the pelvis and the toe on the supporting side, and then detecting that M and L/3 satisfy a predetermined magnitude relation.
  • T-shaped balance: In the T-shaped balance, (1) an operation in which the shoulder comes closer to the three-dimensional camera 200 than the elbow, or the knee comes closer to the three-dimensional camera 200 than the hip joint, is a prohibited operation.
  • a method of detecting the prohibited operation by image analysis will be described, but the method is not limited to this.
  • For example, the prohibited operation detection unit 15 may detect the prohibited operation of (1) by detecting that the line connecting the shoulder and the elbow is inclined from the X axis by a predetermined angle or more in the XZ plane, or that the line connecting the knee and the hip joint is inclined from the X axis by a predetermined angle or more in the XZ plane.
  • When the prohibited operation is detected, the output unit 14 may output a warning to that effect.
  • For example, while the output unit 14 outputs a moving image of the subject 100 in real time as the subject 100 performs the predetermined operation for measuring the range of motion, the prohibited operation detection unit 15 may analyze the image in real time and detect the prohibited operation. Then, when the prohibited operation detection unit 15 detects the prohibited operation, the output unit 14 may display a warning to that effect on the screen shown in FIG. 34 in real time.
  • the output unit 14 may output a warning through a speaker, a warning lamp, or the like.
  • The calculation unit 12 may use the detection result of the prohibited operation detection unit 15 when calculating the measurement value indicating the movable range. As described in the second embodiment, the calculation unit 12 can analyze a moving image showing the subject 100 performing the predetermined series of operations, calculate an angle or a distance for each frame, and calculate the maximum value among the per-frame angles or distances as the measurement value indicating the movable range. In this case, the calculation unit 12 may calculate, as the measurement value indicating the movable range, the maximum value among the angles and distances calculated from the frames in which no prohibited operation was detected.
  • That is, the calculation unit 12 may ignore the angles and distances calculated from frames in which the prohibited operation was detected, and calculate the measurement value indicating the movable range based only on the angles and distances calculated from frames in which no prohibited operation was detected.
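Combining the per-frame measurement with the prohibited-operation flag could then look like the following sketch (the helper functions are the hypothetical ones from the earlier sketches):

```python
def rom_excluding_prohibited(frames, measure_frame, is_prohibited):
    """Maximum per-frame value over only the frames in which no prohibited
    operation was detected; measure_frame and is_prohibited are the
    hypothetical per-frame helpers sketched earlier."""
    valid = [measure_frame(f) for f in frames if not is_prohibited(f)]
    return max(valid) if valid else None
```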
  • With the processing device 10 of the present embodiment, it can be detected that the subject has performed a prohibited operation that affects the accuracy of the measurement while performing the predetermined operation for measuring the movable range of the measurement target. According to the processing device 10 of the present embodiment, the reliability of the measurement results of the movable range is therefore increased.
  • The processing apparatus 10 of the present embodiment differs from the first to fifth embodiments in that it has a function of reducing the problem that the measured value of the movable range becomes larger when the movement is performed with momentum (recoil). The details will be described below.
  • the processing device 10 includes a selection unit 11 and a calculation unit 12, and can further include at least one of an evaluation unit 13, an output unit 14, and a prohibited operation detection unit 15.
  • the configurations of the selection unit 11, the evaluation unit 13, the output unit 14, and the prohibited operation detection unit 15 are the same as those in the first to fifth embodiments.
  • In the present embodiment, the calculation unit 12 analyzes a moving image showing the subject 100 performing the predetermined series of operations and calculates, for each frame, a value (an angle or a distance) indicating the movable range of the measurement target. Then, as the value indicating the movable range at a certain instant, the calculation unit 12 calculates a statistical value (e.g., average, mode, median) of the values indicating the movable range calculated from the frame at that instant and the M frames immediately preceding it.
  • the calculating unit 12 calculates the value indicating the movable range based on the frame at the moment and the M frames immediately before the moment at each of the plurality of moments included in the moving image. Next, the calculation unit 12 calculates the maximum value among the values indicating the movable range calculated corresponding to each of the plurality of instants as a measurement value indicating the movable range of the measurement target.
The other configurations of the calculation unit 12 are the same as those in the first to fifth embodiments.
According to the processing device 10 of the present embodiment, the problem that the measured value of the range of motion is inflated by performing the motion with recoil can be reduced. When the subject uses recoil, the value (angle or distance) calculated at a given instant may be large, but that state is not maintained. Therefore, by using a statistical value computed over a run of consecutive frames as the measured value, a value that spiked only momentarily can be ignored, or its influence reduced.
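A minimal sketch of this smoothing, assuming the per-frame values have already been computed and using the mean as the statistical value (M and the choice of statistic are left open in the text above):

def smoothed_max(values, m):
    # values: per-frame range-of-motion values (angles or distances).
    # For each instant i >= m, average the value of frame i with the
    # m frames immediately before it, then report the maximum average.
    if len(values) <= m:
        return None
    windows = (values[i - m:i + 1] for i in range(m, len(values)))
    return max(sum(w) / len(w) for w in windows)

values = [100, 102, 170, 103, 101]  # one-frame spike produced by recoil
print(smoothed_max(values, m=2))    # -> 125.0, not 170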
1. A processing device comprising: selecting means for selecting a measurement target of a range of motion; and calculating means for calculating a measurement value indicating the range of motion of the selected measurement target by analyzing an image of the subject with an analysis method corresponding to the selected measurement target.
2. The processing device according to 1, wherein the calculating means identifies the subject on the image, identifies a location related to the selected measurement target on the image, and calculates the measurement value based on the identified location.
3. The processing device according to 2, wherein the calculating means calculates, as the measurement value, an angle between a reference line and a line connecting the identified first location and second location.
4. The processing device according to 2 or 3, wherein the calculating means calculates, as the measurement value, a distance between the identified third location and fourth location.
5. The processing device according to any one of 2 to 4, wherein the calculating means calculates, as the measurement value, a distance between a fifth location identified when the subject is in a first posture and the fifth location identified when the subject is in a second posture different from the first posture.
6. The processing device according to any one of 1 to 5, wherein the calculating means analyzes a moving image to calculate the measurement value, calculates a value indicating the range of motion of the selected measurement target based on each of a plurality of frame images, and calculates a statistical value of the calculated plurality of values as the measurement value.
7. A processing device further comprising evaluating means for evaluating a state of the measurement target based on the measurement value.
11. A processing device further comprising output means for outputting information indicating the state of the measurement target evaluated by the evaluating means.
12. The processing device according to 11, wherein, when the state of the measurement target is a predetermined state, the output means outputs information for improving the state of the measurement target or information for acquiring such information.
13. The processing device according to any one of 1 to 12, further comprising prohibited operation detecting means for analyzing the image and detecting that the subject has performed a prohibited operation.
14. The processing device according to 13, wherein the prohibited operation is registered in advance for each measurement target, and the prohibited operation detecting means detects the prohibited operation related to the selected measurement target.
15. The processing device according to 13 or 14, wherein the calculating means identifies a plurality of locations corresponding to the selected measurement target on the image and calculates the measurement value based on the identified plurality of locations; the prohibited operation detecting means identifies a plurality of locations corresponding to the selected measurement target on the image and detects the prohibited operation based on the identified plurality of locations; and the plurality of locations identified by the calculating means for a first measurement target differs from the plurality of locations identified by the prohibited operation detecting means for the first measurement target.
16. A processing method in which a computer executes: a selecting step of selecting a measurement target of a range of motion; and a calculating step of calculating a measurement value indicating the range of motion of the selected measurement target by analyzing an image of the subject with an analysis method corresponding to the selected measurement target.
17. A program causing a computer to function as: selecting means for selecting a measurement target of a range of motion; and calculating means for calculating a measurement value indicating the range of motion of the selected measurement target by analyzing an image of the subject with an analysis method corresponding to the selected measurement target.


Abstract

The present invention provides a processing device (10) comprising: a selection unit (11) that selects a measurement target for a range of motion; and a calculation unit (12) that analyzes a captured image of a subject with an analysis method corresponding to the selected measurement target, thereby calculating a measurement value indicating the range of motion for the selected measurement target.

Description

Processing device, processing method and program
The present invention relates to a processing device, a processing method, and a program.
A medical check is performed to evaluate the range of motion of a predetermined part of the body for purposes such as preventing injury and observing the recovery of an injured part. It is desirable that this medical check can be performed without specialized knowledge. Patent Documents 1 and 2 disclose techniques related to medical checks.
Patent Document 1 discloses that, when a subject performs an upper limb exercise, a position measuring device specifies the position of the tip of the subject's hand, and a hand movable range measuring means calculates the movable range of the hand based on the result of that specification.
Patent Document 2 discloses a device that, when it detects that the subject feels pain or discomfort during a motion, stores information on the subject's posture at that time.
[Patent Document 1] JP-A-2003-52769 [Patent Document 2] JP-A-2015-97004
The technique described in Patent Document 2 merely stores information on the posture of the subject when the subject feels pain or discomfort during a motion; it is not a technique for measuring the movable range of a predetermined part of the subject.
The technique described in Patent Document 1 is limited in the parts whose range of motion can be evaluated and in their movement directions. That is, it cannot measure the range of motion of a plurality of parts of the body or in a plurality of movement directions.
An object of the present invention is to make it possible to measure the range of motion of a plurality of parts of the body and in a plurality of movement directions without specialized knowledge.
According to the present invention, there is provided a processing device comprising: selecting means for selecting a measurement target of a range of motion; and calculating means for calculating a measurement value indicating the range of motion of the selected measurement target by analyzing an image of the subject with an analysis method corresponding to the selected measurement target.
According to the present invention, there is also provided a processing method in which a computer executes: a selecting step of selecting a measurement target of a range of motion; and a calculating step of calculating a measurement value indicating the range of motion of the selected measurement target by analyzing an image of the subject with an analysis method corresponding to the selected measurement target.
According to the present invention, there is further provided a program causing a computer to function as: selecting means for selecting a measurement target of a range of motion; and calculating means for calculating a measurement value indicating the range of motion of the selected measurement target by analyzing an image of the subject with an analysis method corresponding to the selected measurement target.
According to the present invention, the range of motion of a plurality of parts of the body and in a plurality of movement directions can be evaluated without specialized knowledge.
The above and other objects, features and advantages will become more apparent from the preferred embodiments described below and the accompanying drawings.
Brief description of the drawings: the figures include a diagram illustrating an example of how the processing device 10 of the embodiments is used, a diagram of an example hardware configuration, functional block diagrams, examples of screens output by the processing device 10, flowcharts of its processing, diagrams of information it stores, diagrams illustrating examples of measurement targets of the range of motion, and diagrams illustrating examples of prohibited operations.
<First embodiment>
First, an outline of the processing device of the present embodiment will be described. In the present embodiment, as shown in FIG. 1, a three-dimensional camera 200 captures the subject 100 performing a predetermined motion, and the image data generated by the three-dimensional camera 200 is input to the processing device. The three-dimensional camera 200 can acquire, for each pixel, distance information in addition to color information. The three-dimensional camera 200 is, for example, a Kinect.
The processing device can measure the range of motion of a plurality of measurement targets. The plurality of measurement targets differ from one another in at least one of the evaluated body part and the movement direction of that part. Examples of the plurality of measurement targets are described in detail in the following embodiments.
A method of analyzing the image data is prepared in advance for each measurement target, and its algorithm is stored in the processing device. The algorithms corresponding to the plurality of measurement targets have in common that they identify predetermined parts of the body (e.g., the elbow or the shoulder) and calculate a value indicating the range of motion based on the identified parts, but they differ from one another in the parts to identify and in the value (e.g., an angle or a distance) calculated from those parts. Examples of the algorithms are described in the following embodiments.
When the processing device selects a measurement target of the range of motion based on a user input, it analyzes the input image data with the analysis method corresponding to the selected measurement target and calculates a measurement value indicating the range of motion of the selected measurement target.
The configuration of the processing device will now be described in detail. First, an example of the hardware configuration of the processing device is described. Each functional unit of the processing device of the present embodiment is realized by any combination of hardware and software, centered on a CPU (Central Processing Unit) of any computer, a memory, a program loaded into the memory, a storage unit such as a hard disk that stores the program (which can store not only programs stored before the device is shipped, but also programs downloaded from storage media such as CDs (Compact Discs) or from servers on the Internet), and a network connection interface. It will be understood by those skilled in the art that there are various modifications of the methods and devices for realizing them.
FIG. 2 is a block diagram illustrating the hardware configuration of the processing device of the present embodiment. As shown in FIG. 2, the processing device has a processor 1A, a memory 2A, an input/output interface 3A, a peripheral circuit 4A, and a bus 5A. The peripheral circuit 4A includes various modules. The processing device may be configured without the peripheral circuit 4A. The processing device may also be constituted by a plurality of physically separate devices, in which case each device can have the above hardware configuration.
The bus 5A is a data transmission path through which the processor 1A, the memory 2A, the peripheral circuit 4A, and the input/output interface 3A mutually transmit and receive data. The processor 1A is an arithmetic processing device such as a CPU or a GPU (Graphics Processing Unit). The memory 2A is a memory such as a RAM (Random Access Memory) or a ROM (Read Only Memory). The input/output interface 3A includes an interface for acquiring information from an input device, an external device, an external server, an external sensor (the three-dimensional camera), and the like, and an interface for outputting information to an output device, an external device, an external server, and the like. The input device is, for example, a keyboard, a mouse, or a microphone. The output device is, for example, a display, a speaker, a printer, or a mailer. The processor 1A can issue commands to the modules and perform calculations based on their calculation results.
Next, the functional configuration of the processing device is described. FIG. 3 shows an example of a functional block diagram of the processing device 10. As illustrated, the processing device 10 has a selection unit 11 and a calculation unit 12.
The selection unit 11 selects the measurement target of the range of motion based on a user input. For example, the selection unit 11 outputs a UI (user interface) screen for selecting a measurement target to the display and accepts a user input from that UI screen.
As an example, the selection unit 11 may output a UI screen on which "all item measurement" and "individual measurement" can be selected, as shown in FIG. 4.
When "all item measurement" is specified by the user input, the selection unit 11 selects all the measurement targets one by one in a predetermined order.
On the other hand, when "individual measurement" is specified by the user input, the selection unit 11 outputs to the display a UI screen that lists a plurality of measurement targets in a selectable manner, as shown in FIG. 5, and selects the measurement target specified by the user input on that UI screen. Details of the measurement targets shown in FIG. 5 are described in the following embodiments.
Alternatively, the selection unit 11 may output to the display a UI screen for specifying a motion and accept a user input from that UI screen. The motions that can be specified on the UI screen may be motions performed in a predetermined sport, for example, but not limited to, a baseball throw, a baseball swing, a tennis serve, a tennis forehand, or a tennis volley. They may also be motions performed in daily life, for example, but not limited to, climbing stairs or lifting an object. In this case, the selection unit 11 stores in advance information associating each motion with the measurement targets related to that motion, and selects the measurement targets corresponding to the motion specified by the user input one by one in a predetermined order.
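For instance, the correspondence between motions and their related measurement targets could be held as a simple lookup table; the entries below are illustrative assumptions, not a mapping defined by the patent:

# Hypothetical motion-to-measurement-target table (illustrative only).
MOTION_TO_TARGETS = {
    "baseball_throw": ["shoulder_external_rotation_right", "trunk_rotation_left"],
    "tennis_serve":   ["shoulder_flexion_right", "trunk_rotation_right"],
    "climb_stairs":   ["hip_flexion_right", "hip_flexion_left"],
}

def targets_for_motion(motion):
    # Return the measurement targets registered for the given motion,
    # in the predetermined order in which they are to be measured.
    return MOTION_TO_TARGETS.get(motion, [])

print(targets_for_motion("tennis_serve"))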
The calculation unit 12 calculates the measurement value indicating the range of motion of the selected measurement target by analyzing an image of the subject with the analysis method corresponding to the selected measurement target. The "image" is an image of the subject 100 performing a predetermined motion, captured by the three-dimensional camera 200.
Algorithms indicating the image-data analysis method for each of the plurality of measurement targets are stored in the processing device 10 in advance. As described above, the algorithms have in common that they identify predetermined parts of the body (e.g., the elbow or the shoulder) and calculate a value indicating the range of motion based on the identified parts, while differing in the parts to identify and in the values (e.g., angles or distances) calculated from them. The calculation unit 12 calculates the measurement value indicating the range of motion of the selected measurement target by analyzing the image with the algorithm corresponding to the measurement target selected by the selection unit 11. Details of the algorithms are described in the following embodiments.
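One natural way to hold "an analysis method per measurement target" is a dispatch table from target to analysis function, as in the sketch below; the function names are assumptions, and the bodies are stubs standing in for the per-target algorithms described in the second embodiment:

def analyze_shoulder_flexion(frames):
    ...  # stub: angle-based analysis (see the second embodiment)

def analyze_lower_limb_reach_lateral(frames):
    ...  # stub: distance-based analysis (see the second embodiment)

ANALYSIS_METHODS = {
    "shoulder_flexion_right":   analyze_shoulder_flexion,
    "lower_limb_reach_lateral": analyze_lower_limb_reach_lateral,
}

def calculate_measurement(target, frames):
    # Apply the analysis method registered for the selected target
    # to the frames captured by the three-dimensional camera.
    return ANALYSIS_METHODS[target](frames)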
Next, an example of the processing flow of the processing device 10 of the present embodiment is described with reference to the flowchart of FIG. 6.
First, the selection unit 11 selects the measurement target of the range of motion based on a user input (S10). Next, the calculation unit 12 calculates the measurement value indicating the range of motion of the selected measurement target by analyzing the image of the subject with the analysis method corresponding to the measurement target selected in S10 (S20).
As described above, the processing device 10 of the present embodiment prepares an image-data analysis method (algorithm) for each of a plurality of measurement targets that differ from one another in at least one of the evaluated body part and its movement direction, analyzes an image of the subject with the analysis method corresponding to the selected measurement target, and calculates a measurement value indicating the range of motion of that measurement target. Such a processing device 10 can evaluate the range of motion of a plurality of parts of the body and in a plurality of movement directions without specialized knowledge.
Furthermore, according to the processing device 10 of the present embodiment, the range of motion is calculated by a computer analyzing images based on a predetermined algorithm, which yields advantages such as the range of motion being calculated in a shorter time than when a person evaluates it, and high reproducibility being obtained.
In addition, according to one example of the processing device 10 of the present embodiment, when a motion (e.g., a baseball throw, a baseball swing, a tennis serve, a tennis forehand, or a tennis volley) is specified by a user input, the measurement targets related to the specified motion are selected. In this case, even a user without specialized knowledge of the measurement targets can have the appropriate measurement targets selected simply by specifying a motion relevant to him or her; that is, even less expertise is required.
<Second embodiment>
The processing device 10 of the present embodiment differs from that of the first embodiment in that the method of calculating the measurement value indicating the range of motion is made more concrete. The details are described below.
An example of the hardware configuration of the processing device 10 is the same as in the first embodiment.
An example of the functional block diagram of the processing device 10 is shown in FIG. 3, as in the first embodiment. As illustrated, the processing device 10 has a selection unit 11 and a calculation unit 12. The configuration of the selection unit 11 is the same as in the first embodiment.
As in the first embodiment, the calculation unit 12 calculates the measurement value indicating the range of motion of the selected measurement target by analyzing an image of the subject with the analysis method corresponding to the measurement target selected by the selection unit 11.
Specifically, the calculation unit 12 first identifies the subject on the image. The means for identifying a person present in an image is not particularly limited and can be realized with any technique.
Next, the calculation unit 12 identifies, on the image, the body locations related to the selected measurement target. The locations to identify include, but are not limited to, the head, the neck, the left and right hand tips, the left and right thumbs, the left and right wrists, the left and right elbows, the left and right shoulders, the midpoint between the shoulders, the left and right hip joints, the midpoint between the hip joints, the middle of the pelvis, the middle of the spine, the left and right knees, the left and right ankles, and the left and right toes. These locations can be identified by image analysis based on the appearance characteristics of the whole body, of each location, and the like.
The calculation unit 12 then calculates the measurement value based on the identified body locations.
For example, the calculation unit 12 may calculate, as the measurement value indicating the range of motion, the angle between a reference line and a line connecting a first location and a second location of the body.
Alternatively, the calculation unit 12 may calculate, as the measurement value indicating the range of motion, the distance between a third location and a fourth location of the body.
Alternatively, the calculation unit 12 may calculate, as the measurement value indicating the range of motion, the distance between a fifth location of the body when the subject is in a first posture and the same fifth location when the subject, at the same position, is in a second posture different from the first posture.
As shown in FIG. 7, information indicating the body locations to identify for each measurement target may be registered in the processing device 10 in advance. The calculation unit 12 may then determine the body locations to identify based on this information and the measurement target selected by the selection unit 11, and may likewise determine the value to calculate based on the information shown in FIG. 7.
Each measurement target is now described in detail, together with an example of the body locations identified for it and an example of the image analysis method.
"Shoulder joint flexion ("shoulder flexion right" and "shoulder flexion left" in FIG. 5)"
In shoulder joint flexion, as shown in FIGS. 8 and 9, the subject 100 faces the three-dimensional camera 200 (not shown) and, from the attention posture with the palm turned inward (toward the body), raises the arm upward past the front of the body. For "shoulder flexion right" the subject 100 moves the right arm, and for "shoulder flexion left" the left arm.
The image analysis method may, for example, be as follows. First, the calculation unit 12 identifies, by image analysis, the shoulder and the elbow on the moving side (the positions indicated by dots in FIGS. 8 and 9). Next, the calculation unit 12 defines the line dropped from the shoulder in the Y-axis direction as the reference line (the line indicated by S in FIG. 8) and the line connecting the shoulder and the elbow as the movement line. The XYZ axis directions are as shown in FIGS. 1 and 8: the direction connecting the three-dimensional camera 200 and the subject 100 is the Z-axis direction, the vertical direction is the Y-axis direction, and the horizontal left-right direction of the three-dimensional camera 200 is the X-axis direction.
The calculation unit 12 then calculates the angle between the reference line projected onto the YZ plane and the movement line projected onto the YZ plane as the measurement value indicating the range of motion of shoulder joint flexion.
The calculation unit 12 can, for example, analyze a moving image showing the subject 100 performing the above series of motions. In this case, the calculation unit 12 analyzes the image frame by frame, calculates the above angle for each frame, and can report the maximum of the per-frame angles as the measurement value indicating the range of motion of shoulder joint flexion.
The image analysis method described here is one example, and other methods along the same lines can also be adopted.
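A sketch of this analysis under stated assumptions: keypoints are (x, y, z) coordinates from the three-dimensional camera using the axis convention above, and each frame is a dict of named keypoints (the names are not from the patent):

import math

def shoulder_flexion_angle(shoulder, elbow):
    # Project the shoulder-to-elbow line onto the YZ plane and measure
    # its angle against the reference line dropped straight down (-Y)
    # from the shoulder.
    dy = elbow[1] - shoulder[1]
    dz = elbow[2] - shoulder[2]
    cos_a = -dy / math.hypot(dy, dz)  # cosine against the (-Y) direction in YZ
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))

def flexion_range_of_motion(frames):
    # The maximum per-frame angle is reported as the measurement value.
    return max(shoulder_flexion_angle(f["right_shoulder"], f["right_elbow"])
               for f in frames)

frames = [
    {"right_shoulder": (0.0, 1.4, 2.0), "right_elbow": (0.0, 1.1, 2.0)},  # arm down
    {"right_shoulder": (0.0, 1.4, 2.0), "right_elbow": (0.0, 1.4, 1.7)},  # raised forward
]
print(flexion_range_of_motion(frames))  # -> 90.0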
"Shoulder joint external rotation ("shoulder external rotation right" and "shoulder external rotation left" in FIG. 5)"
In shoulder joint external rotation, as shown in FIGS. 10 and 11, the subject 100 faces the three-dimensional camera 200 (not shown), bends the elbow 90° from the 90°-abduction posture so that the hand points upward, and, with the palm facing inward (the thumb pointing toward the back of the body), tilts the forearm toward the back of the body. For "shoulder external rotation right" the subject 100 moves the right arm, and for "shoulder external rotation left" the left arm.
The image analysis method may, for example, be as follows. First, the calculation unit 12 identifies, by image analysis, the elbow and the wrist on the moving side (the positions indicated by dots in FIGS. 10 and 11). Next, the calculation unit 12 defines the line extending from the elbow in the Z-axis direction (toward the front of the body) as the reference line (the line indicated by S in FIG. 10) and the line connecting the elbow and the wrist as the movement line.
The calculation unit 12 then calculates the angle between the reference line projected onto the YZ plane and the movement line projected onto the YZ plane as the measurement value indicating the range of motion of shoulder joint external rotation.
The calculation unit 12 can, for example, analyze a moving image showing the subject 100 performing the above series of motions, calculate the above angle for each frame, and report the maximum of the per-frame angles as the measurement value indicating the range of motion of shoulder joint external rotation.
The image analysis method described here is one example, and other methods along the same lines can also be adopted.
"Shoulder joint internal rotation ("shoulder internal rotation right" and "shoulder internal rotation left" in FIG. 5)"
In shoulder joint internal rotation, as shown in FIGS. 12 and 13, the subject 100 faces the three-dimensional camera 200 (not shown), bends the elbow 90° from the 90°-abduction posture so that the hand points upward, and, with the palm facing outward (the thumb pointing toward the front of the body), tilts the forearm toward the front of the body. For "shoulder internal rotation right" the subject 100 moves the right arm, and for "shoulder internal rotation left" the left arm.
The image analysis method may, for example, be as follows. First, the calculation unit 12 identifies, by image analysis, the elbow and the wrist on the moving side (the positions indicated by dots in FIGS. 12 and 13). Next, the calculation unit 12 defines the line extending from the elbow in the Z-axis direction (toward the back of the body) as the reference line (the line indicated by S in FIG. 12) and the line connecting the elbow and the wrist as the movement line.
The calculation unit 12 then calculates the angle between the reference line projected onto the YZ plane and the movement line projected onto the YZ plane as the measurement value indicating the range of motion of shoulder joint internal rotation.
The calculation unit 12 can, for example, analyze a moving image showing the subject 100 performing the above series of motions, calculate the above angle for each frame, and report the maximum of the per-frame angles as the measurement value indicating the range of motion of shoulder joint internal rotation.
The image analysis method described here is one example, and other methods along the same lines can also be adopted.
"Hip joint flexion ("hip flexion right" and "hip flexion left" in FIG. 5)"
In hip joint flexion, as shown in FIGS. 14 and 15, the subject 100 faces the three-dimensional camera 200 (not shown) and, from the attention posture, moves one leg so as to bring the knee toward the chest. For "hip flexion right" the subject 100 moves the right leg, and for "hip flexion left" the left leg.
The image analysis method may, for example, be as follows. First, the calculation unit 12 identifies, by image analysis, the hip joint and the knee on the moving side (the positions indicated by dots in FIGS. 14 and 15). Next, the calculation unit 12 defines the line dropped from the hip joint in the Y-axis direction as the reference line (the line indicated by S in FIG. 14) and the line connecting the hip joint and the knee as the movement line.
The calculation unit 12 then calculates the angle between the reference line projected onto the YZ plane and the movement line projected onto the YZ plane as the measurement value indicating the range of motion of hip joint flexion.
The calculation unit 12 can, for example, analyze a moving image showing the subject 100 performing the above series of motions, calculate the above angle for each frame, and report the maximum of the per-frame angles as the measurement value indicating the range of motion of hip joint flexion.
The image analysis method described here is one example, and other methods along the same lines can also be adopted.
"Trunk rotation ("trunk rotation right" and "trunk rotation left" in FIG. 5)"
In trunk rotation, as shown in FIGS. 16 and 17, the subject 100 faces the three-dimensional camera 200 (not shown) and rotates the upper body in a seated position with both hands clasped behind the head, shifting the body weight in the direction of rotation. For "trunk rotation right" the subject 100 rotates the upper body to the right, and for "trunk rotation left" to the left.
The image analysis method may, for example, be as follows. First, the calculation unit 12 identifies, by image analysis, both hip joints and both shoulders (the positions indicated by dots in FIGS. 16 and 17). Next, the calculation unit 12 defines the line connecting the two hip joints as the reference line (the line indicated by S in FIG. 16) and the line connecting the two shoulders as the movement line.
The calculation unit 12 then calculates the angle between the reference line projected onto the XZ plane and the movement line projected onto the XZ plane as the measurement value indicating the range of motion of trunk rotation.
The calculation unit 12 can, for example, analyze a moving image showing the subject 100 performing the above series of motions, calculate the above angle for each frame, and report the maximum of the per-frame angles as the measurement value indicating the range of motion of trunk rotation.
The image analysis method described here is one example, and other methods along the same lines can also be adopted.
"Lower limb reach, lateral ("lower limb reach lateral" in FIG. 5)"
In the lateral lower limb reach, as shown in FIGS. 18 and 19, the subject 100 faces the three-dimensional camera 200 (not shown), stands on one leg with the hands on the hips, extends the lifted leg as far as possible to the side (in the X-axis direction) without letting it touch the ground so that the toe moves away from the body, and then returns to the original posture. The lateral lower limb reach can be performed in two patterns: standing on the left leg and extending the right leg, and standing on the right leg and extending the left leg.
The image analysis method may, for example, be as follows. First, the calculation unit 12 identifies, by image analysis, the left and right toes (the positions indicated by dots in FIGS. 18 and 19). The calculation unit 12 then calculates the X-axis distance between the left and right toes as the measurement value indicating the range of motion of the lateral lower limb reach.
The calculation unit 12 can, for example, analyze a moving image showing the subject 100 performing the above series of motions, calculate the above distance for each frame, and report the maximum of the per-frame distances as the measurement value indicating the range of motion of the lateral lower limb reach.
The image analysis method described here is one example, and other methods along the same lines can also be adopted.
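A corresponding sketch for this distance-based measurement, under the same assumed keypoint format:

def lateral_reach(frames):
    # Report the maximum X-axis distance between the left and right toes
    # over all frames of the motion.
    return max(abs(f["left_toe"][0] - f["right_toe"][0]) for f in frames)

frames = [
    {"left_toe": (0.10, 0.0, 2.0), "right_toe": (0.35, 0.0, 2.0)},  # standing
    {"left_toe": (0.10, 0.0, 2.0), "right_toe": (0.95, 0.0, 2.0)},  # full reach
]
print(lateral_reach(frames))  # -> 0.85 (in whatever unit the camera reports)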
"Lower limb reach, backward ("lower limb reach backward" in FIG. 5)"
In the backward lower limb reach, as shown in FIGS. 20 and 21, the subject 100 faces the X-axis direction, stands on one leg with the hands on the hips, extends the lifted leg as far as possible backward (in the X-axis direction) without letting it touch the ground so that the toe moves away from the body, and then returns to the original posture. The backward lower limb reach can likewise be performed in two patterns: standing on the left leg and extending the right leg, and standing on the right leg and extending the left leg.
The image analysis method may, for example, be as follows. First, the calculation unit 12 identifies, by image analysis, the left and right toes (the positions indicated by dots in FIGS. 20 and 21). The calculation unit 12 then calculates the X-axis distance between the left and right toes as the measurement value indicating the range of motion of the backward lower limb reach.
The calculation unit 12 can, for example, analyze a moving image showing the subject 100 performing the above series of motions, calculate the above distance for each frame, and report the maximum of the per-frame distances as the measurement value indicating the range of motion of the backward lower limb reach.
The image analysis method described here is one example, and other methods along the same lines can also be adopted.
"Upper limb reach ("upper limb reach" in FIG. 5)"
In the upper limb reach, as shown in FIGS. 22 and 23, the subject 100 faces the X-axis direction and steps back three steps from the initial state in which the foot on the measured side is placed at the measurement position. From there, the subject 100 returns the foot on the non-measured side to its initial position and takes a first posture with both arms extended forward (the posture shown by the dotted line in FIG. 22). Next, at that position, the subject 100 stretches the whole body forward and takes a second posture in which the fingertips are moved as far away as possible (the posture shown by the solid line in FIG. 22). The upper limb reach can be performed in two patterns: with the right foot as the measured side and with the left foot as the measured side.
The image analysis method may, for example, be as follows. First, the calculation unit 12 identifies, by image analysis, the hand tip in the first posture and the hand tip in the second posture (the positions indicated by dots in FIGS. 22 and 23). The calculation unit 12 then calculates the X-axis distance between the hand tip in the first posture and the hand tip in the second posture as the measurement value indicating the range of motion of the upper limb reach.
The calculation unit 12 can, for example, analyze a moving image showing the series of motions from the first posture to the second posture, calculate the above distance for each frame, and report the maximum of the per-frame distances as the measurement value indicating the range of motion of the upper limb reach.
The image analysis method described here is one example, and other methods along the same lines can also be adopted.
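Unlike the previous measurements, this one compares the same body location at two different instants; a sketch under the same assumed keypoint format:

def upper_limb_reach(frames):
    # frames: sequence starting at the first posture. For each later
    # frame, measure the X-axis distance of the hand tip from its
    # position in the first posture, and report the maximum.
    x0 = frames[0]["right_hand_tip"][0]
    return max(abs(f["right_hand_tip"][0] - x0) for f in frames[1:])

frames = [
    {"right_hand_tip": (1.20, 1.30, 2.0)},  # first posture, arms forward
    {"right_hand_tip": (1.42, 1.25, 2.0)},
    {"right_hand_tip": (1.55, 1.20, 2.0)},  # second posture, full stretch
]
print(upper_limb_reach(frames))  # -> 0.35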
"T-balance ("T-balance left support" and "T-balance right support" in FIG. 5)"
In the T-balance, as shown in FIGS. 24 and 25, the subject 100 faces the X-axis direction, stands on one leg, and holds the arms and the free leg as parallel to the floor as possible. For "T-balance left support" the subject 100 stands on the left leg, and for "T-balance right support" on the right leg.
The image analysis method may, for example, be as follows. First, the calculation unit 12 identifies, by image analysis, the shoulder, the elbow, the hip joint, and the knee (the positions indicated by dots in FIGS. 24 and 25). Next, the calculation unit 12 defines the line extending from the shoulder in the X-axis direction (on the side opposite the arm) as a reference line (a line indicated by S in FIG. 24) and the line connecting the shoulder and the elbow as the movement line paired with it. Likewise, the calculation unit 12 defines the line extending from the hip joint in the X-axis direction (on the side opposite the leg) as a reference line (a line indicated by S in FIG. 24) and the line connecting the hip joint and the knee as the movement line paired with it.
The calculation unit 12 then calculates the angle between each reference line projected onto the XY plane and the corresponding movement line projected onto the XY plane as a measurement value indicating the range of motion of the T-balance.
The calculation unit 12 can, for example, analyze a moving image showing the subject 100 performing the above series of motions, calculate the above angles for each frame, and report the maximum of the per-frame angles as the measurement value indicating the range of motion of the T-balance.
The image analysis method described here is one example, and other methods along the same lines can also be adopted.
 次に、本実施形態の処理装置10の処理の流れの一例は第1の実施形態同様、図6のフローチャートで示される。ここで、図6のS20の処理の流れの一例を、図26のフローチャートを用いて説明する。 Next, an example of the processing flow of the processing device 10 of the present embodiment is shown in the flowchart of FIG. 6 as in the first embodiment. Here, an example of the flow of the process of S20 in FIG. 6 will be described with reference to the flowchart in FIG.
 First, the calculation unit 12 identifies the subject on the image (S21). Next, the calculation unit 12 identifies, on the image, the body locations related to the measurement target selected in S10 (S22). The calculation unit 12 then calculates the measurement value based on the locations identified in S22 (S23).
 The processing device 10 of the present embodiment described above achieves the same operations and effects as the first embodiment.
 Furthermore, according to the processing device 10 of the present embodiment, after the body locations related to the measurement target are identified by image analysis, the measurement value can be calculated from the identified locations. For example, it is possible to calculate the angle between a line connecting two locations and a reference line, the distance between two locations, or the distance between a location identified while the subject takes a first posture and the same location identified while the subject takes a second posture. Such a processing device 10 can measure the range of motion of various parts of the body in various directions of movement.
<Third embodiment>
 The processing device 10 of the present embodiment differs from the first and second embodiments in that it has a function of evaluating the state of the measurement target based on the measurement value indicating the range of motion. This is described in detail below.
 An example of the hardware configuration of the processing device 10 is the same as in the first and second embodiments.
 An example of a functional block diagram of the processing device 10 is shown in FIG. 27. As illustrated, the processing device 10 includes a selection unit 11, a calculation unit 12, and an evaluation unit 13. The configurations of the selection unit 11 and the calculation unit 12 are the same as in the first and second embodiments.
 The evaluation unit 13 evaluates the state of the measurement target based on the measurement value indicating the range of motion calculated by the calculation unit 12. The evaluation unit 13 can execute at least one of the first to third evaluation methods described below.
"First evaluation method"
 When two measurement values are calculated for a measurement target, one for each of the left and right sides of the body, the evaluation unit 13 can calculate the difference between the two values and evaluate the state of the measurement target based on that difference. The smaller the difference, the better the left-right balance; the larger the difference, the more the left-right balance is impaired. The first evaluation method therefore makes it possible to evaluate whether the left-right balance of the body is appropriate.
 "Calculating two measurement values corresponding to the left and right sides of the body" covers, for example: calculating measurement values for right and left shoulder flexion; for right and left shoulder external rotation; for right and left shoulder internal rotation; for right and left hip flexion; for right and left trunk rotation; for the lateral lower-limb reach, one value measured standing on the left foot and extending the right foot and one measured standing on the right foot and extending the left foot; for the backward lower-limb reach, one value measured standing on the left foot and extending the right foot and one measured standing on the right foot and extending the left foot; for the upper-limb reach, one value with the right foot as the measurement-side foot and one with the left foot as the measurement-side foot; and measurement values for T-shaped balance left support and T-shaped balance right support.
 For example, the evaluation unit 13 may classify the difference between the two measurement values into a plurality of levels and hold first state definition information that defines the state of the measurement target for each level. The first state definition information defines, for example, a difference of D0 to D1 as "good", D1 to D2 as "slightly dangerous", and D2 to D3 as "dangerous". The evaluation unit 13 may then evaluate the state of each measurement target based on the first state definition information and the difference between the two measurement values.
 Note that the first state definition information may be defined for each measurement target. That is, the numerical ranges of the difference corresponding to the states "good", "slightly dangerous", "dangerous", and so on may differ for each measurement target.
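 A minimal sketch of this level classification is given below. The threshold values D0 to D3 are hypothetical; the actual ranges, and whether they differ per measurement target, are left open by the embodiment.

```python
# Hypothetical first state definition information: (lower bound, upper bound, state).
# D0 = 0, D1 = 5, D2 = 10, D3 = 180 degrees are assumed values for illustration only.
FIRST_STATE_DEFINITION = [
    (0.0, 5.0, "good"),
    (5.0, 10.0, "slightly dangerous"),
    (10.0, 180.0, "dangerous"),
]

def evaluate_left_right(left_value, right_value, definition=FIRST_STATE_DEFINITION):
    """First evaluation method: classify the left-right difference into a state."""
    diff = abs(left_value - right_value)
    for low, high, state in definition:
        if low <= diff < high:
            return state
    return "out of range"
```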
"Second evaluation method"
 The evaluation unit 13 can evaluate the state of the measurement target based on the result of comparing the measurement value with a reference value.
 For example, the evaluation unit 13 may classify the measurement value into a plurality of levels and hold second state definition information that defines the state of the measurement target for each level. The second state definition information defines, for example, a measurement value of C0 to C1 as "good", C1 to C2 as "slightly dangerous", and C2 to C3 as "dangerous". The evaluation unit 13 may then evaluate the state of each measurement target based on the second state definition information and the measurement value.
 Note that the second state definition information may be defined for each measurement target. That is, the numerical ranges of the measurement value corresponding to the states "good", "slightly dangerous", "dangerous", and so on may differ for each measurement target. The second state definition information may also be defined for each attribute of the subject (e.g., for each age group or sex). That is, the numerical ranges corresponding to each state may differ for each attribute of the subject. In this case, the processing device 10 accepts, by any appropriate means, input of values specifying the subject's attributes; the evaluation unit 13 identifies the subject's attributes from the input values and evaluates the state of the measurement target using the second state definition information corresponding to the identified attributes.
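 The attribute-dependent lookup can be sketched as below. The attribute keys and threshold values are invented for illustration; the embodiment only requires that a separate definition can be selected per attribute.

```python
# Hypothetical second state definition information keyed by subject attributes.
# Thresholds (in degrees) are assumed values, not taken from the embodiment.
SECOND_STATE_DEFINITIONS = {
    ("male", "20-39"): [(120.0, 181.0, "good"), (90.0, 120.0, "slightly dangerous"), (0.0, 90.0, "dangerous")],
    ("female", "20-39"): [(115.0, 181.0, "good"), (85.0, 115.0, "slightly dangerous"), (0.0, 85.0, "dangerous")],
}

def evaluate_against_reference(value, sex, age_group):
    """Second evaluation method: classify a measurement value using the
    state definition selected by the subject's attributes."""
    for low, high, state in SECOND_STATE_DEFINITIONS[(sex, age_group)]:
        if low <= value < high:
            return state
    return "out of range"
```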
"Third evaluation method"
 The evaluation unit 13 evaluates the state of the measurement target based on the subject's own measurement value (e.g., the latest measurement value) and a past measurement value. The third evaluation method makes it possible to evaluate how the state of the subject's measurement target has changed (deterioration, no change, improvement, etc.).
 For example, the evaluation unit 13 may hold third state definition information that defines the state of the measurement target relative to a past measurement value. The third state definition information defines, for example, "unchanged" when the measurement value is between the past value minus α and the past value plus α, "deteriorated" when the measurement value is less than the past value minus α, and "improved" when the measurement value is greater than the past value plus α. The evaluation unit 13 may then evaluate the state of each measurement target based on the third state definition information, the past measurement value, and, for example, the latest measurement value.
 Note that the third state definition information may be defined for each measurement target. That is, the value of α may differ for each measurement target. The third state definition information may also be defined for each attribute of the subject (e.g., for each age group or sex). That is, the value of α may differ for each attribute of the subject. In this case, the processing device 10 accepts, by any appropriate means, input of values specifying the subject's attributes; the evaluation unit 13 identifies the subject's attributes from the input values and evaluates the state of the measurement target using the third state definition information corresponding to the identified attributes.
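 The comparison against a past value reduces to a simple tolerance test, sketched below; the value of α is a hypothetical setting.

```python
def evaluate_trend(latest, past, alpha=5.0):
    """Third evaluation method: compare the latest measurement value with a
    past value. alpha is a hypothetical tolerance (e.g., in degrees)."""
    if latest < past - alpha:
        return "deteriorated"
    if latest > past + alpha:
        return "improved"
    return "unchanged"
```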
 Next, an example of the processing flow of the processing device 10 of the present embodiment is described with reference to the flowchart of FIG. 28.
 First, the selection unit 11 selects the measurement target of the range of motion based on user input (S10). Next, the calculation unit 12 calculates the measurement value indicating the range of motion of the selected measurement target by analyzing the image of the subject with the analysis method corresponding to the measurement target selected in S10 (S20). The evaluation unit 13 then evaluates the state of the measurement target based on the measurement value calculated in S20 (S30).
 The processing device 10 of the present embodiment described above achieves the same operations and effects as the first and second embodiments.
 Furthermore, according to the processing device 10 of the present embodiment, the state of the measurement target can be evaluated based on the measurement values.
 For example, when two measurement values are calculated corresponding to the left and right sides of the body, the state of the measurement target can be evaluated based on the difference between them. In this case, it is possible to evaluate whether the left-right balance of the body is appropriate.
 The state of the measurement target can also be evaluated based on the result of comparing the measurement value with a reference value. In this case, it is possible to evaluate whether the state of the measurement target matches a general standard.
 The state of the measurement target can further be evaluated based on the result of comparing the subject's measurement value with a past measurement value of the same subject. In this case, changes in the state of the subject's measurement target (deterioration, no change, improvement, etc.) can be evaluated.
 In this way, the processing device 10 of the present embodiment can evaluate the state of the measurement target from various angles.
<Fourth embodiment>
 The processing device 10 of the present embodiment differs from the first to third embodiments in that it has a function of outputting information indicating the state of the measurement target. This is described in detail below.
 An example of the hardware configuration of the processing device 10 is the same as in the first to third embodiments.
 An example of a functional block diagram of the processing device 10 is shown in FIG. 29. As illustrated, the processing device 10 includes a selection unit 11, a calculation unit 12, an evaluation unit 13, and an output unit 14. The configurations of the selection unit 11, the calculation unit 12, and the evaluation unit 13 are the same as in the first to third embodiments.
 The output unit 14 outputs various kinds of information via an output device. Examples of the output device include, but are not limited to, a display, a projection device, a mailer, a printer, a speaker, and a warning lamp.
 The output unit 14 outputs, for example, information indicating the state of the measurement target as evaluated by the evaluation unit 13. When the state of the measurement target is a predetermined state, the output unit 14 can output, in addition to the information indicating the state of the measurement target, information for improving the state of the measurement target or information for obtaining such information.
 The "predetermined state" here is a state that requires improvement. When the states of the measurement target are defined as exemplified in the third embodiment, the predetermined state is, for example, "dangerous", "slightly dangerous", or "deteriorated".
 "Information for improving the state of the measurement target" indicates exercises for improving the state. The information may be a video recording of a person performing the exercise, or an explanation of the exercise using photographs, illustrations, text, and so on. "Information for obtaining such information" is, for example, the storage location (e.g., a URL) of a data file containing the information, or guidance on how to access it.
 The output unit 14 may also output the measurement value indicating the range of motion of the measurement target calculated by the calculation unit 12.
 FIG. 30 shows an example of the information output by the output unit 14. In FIG. 30, for each measurement target, the output unit 14 outputs, in association with one another, the measurement values indicating the range of motion (both left and right), the difference between the left and right measurement values, and the state evaluated by the evaluation unit 13. The state evaluated by the evaluation unit 13 is indicated by the expression of a face icon. For measurement targets whose evaluated state is a predetermined state (a state requiring improvement), the output unit 14 outputs icons with embedded video links (the icons labeled "Do It!" and "Check").
 FIG. 31 shows another example of the information output by the output unit 14. In FIG. 31, for each measurement target, the output unit 14 outputs, in association with one another, the state evaluated by the evaluation unit 13, the difference between the left and right measurement values, and a bar graph indicating that difference. The state evaluated by the evaluation unit 13 is indicated by the expression of a face icon; the difference between the left and right measurement values is the number below the face icon. For measurement targets whose evaluated state is a predetermined state (a state requiring improvement), the output unit 14 outputs icons with embedded video links (the icons labeled "Do It!" and "Check").
 FIG. 32 shows another example of the information output by the output unit 14. In FIG. 32, the evaluation unit 13 evaluates the state of each measurement target on a four-level scale from 0 to 3, where 3 is the best state and smaller values indicate worse states. The output unit 14 outputs a radar chart showing the state of each measurement target and the overall state.
 FIG. 33 shows another example of the information output by the output unit 14. In FIG. 33, the output unit 14 outputs information indicating the time-series change of the measurement values of one measurement target for one subject 100.
 As shown in FIG. 34, the output unit 14 may also output, in real time, a video of the subject 100 while the subject 100 is performing the predetermined motion for measuring the range of motion, and may output various kinds of information on that screen.
 On the screen of FIG. 34, a real-time video of the subject 100 is displayed in area (1), and information indicating the measurement target selected by the selection unit 11 is displayed in area (2).
 In area (3), the current measurement value indicating the range of motion, calculated by the calculation unit 12 from the frame currently displayed in area (1) or the frame immediately before it, is displayed. Area (3) also displays the maximum of the measurement values calculated from the individual frames during the measurement. In this case, the calculation unit 12 analyzes the image and the output unit 14 outputs the analysis result (the measurement value) in real time.
 The screen of FIG. 34 also displays, among other things, a button for redoing the measurement and a button (the button marked with a star) for forcibly updating the maximum value displayed in area (3) with the measurement value currently displayed in area (3).
 The processing device 10 of the present embodiment described above achieves the same operations and effects as the first to third embodiments.
 Furthermore, according to the processing device 10 of the present embodiment, various kinds of information, such as the measurement values indicating the range of motion, the difference between the left and right measurement values, and the state of the measurement target, can be output for each measurement target. By viewing this information, the subject 100 can easily grasp the state of the relevant parts of his or her body.
 In addition, according to the processing device 10 of the present embodiment, when there is a measurement target in a poor state, information for improving the state of that measurement target, or information for obtaining such information, can be output. In this way, the processing device 10 can provide the information the subject 100 needs at an appropriate time.
<Fifth embodiment>
 The processing device 10 of the present embodiment differs from the first to fourth embodiments in that it has a function of detecting that the subject has performed, while carrying out the predetermined motion for measuring the range of motion of the measurement target, a prohibited motion that affects the accuracy of the measurement. This is described in detail below.
 An example of the hardware configuration of the processing device 10 is the same as in the first to fourth embodiments.
 An example of a functional block diagram of the processing device 10 is shown in FIG. 35. As illustrated, the processing device 10 includes a selection unit 11, a calculation unit 12, an evaluation unit 13, an output unit 14, and a prohibited motion detection unit 15. The configurations of the selection unit 11, the calculation unit 12, the evaluation unit 13, and the output unit 14 are the same as in the first to fourth embodiments.
 The prohibited motion detection unit 15 analyzes the image of the subject 100 and detects that the subject has performed a prohibited motion. The prohibited motion detection unit 15 identifies, on the image, a plurality of locations of the subject's body related to the measurement target selected by the selection unit 11, and detects the prohibited motion based on the identified locations. For example, the prohibited motion detection unit 15 may determine that a prohibited motion has occurred when the angle between a line connecting a first and a second location of the body and a reference line deviates from a reference range. It may also determine that a prohibited motion has occurred when the distance between a third and a fourth location of the body deviates from a reference range.
 As shown in FIG. 36, information indicating the body locations to be identified for each measurement target may be registered in the processing device 10 in advance. The prohibited motion detection unit 15 may then determine which body locations to identify based on this information and the measurement target selected by the selection unit 11. The prohibited motion detection unit 15 may also determine what to detect based on the information shown in FIG. 36.
 As described in the second embodiment, the calculation unit 12 also identifies, on the image, a plurality of locations of the subject's body related to the measurement target selected by the selection unit 11 and performs predetermined processing based on the identified locations. However, the body locations identified by the calculation unit 12 and those identified by the prohibited motion detection unit 15 may differ from each other (i.e., they do not completely coincide). That is, the plurality of body locations the calculation unit 12 identifies for a first measurement target may differ from the plurality of body locations the prohibited motion detection unit 15 identifies for the same first measurement target.
 Examples of the prohibited motions for each measurement target, and of what the prohibited motion detection unit 15 detects, are described below.
"Shoulder flexion"
 For shoulder flexion, as shown in FIG. 37, the prohibited motions include (1) tilting the trunk forward or backward, (2) pulling the shoulder of the moving arm backward, and (3) letting the moving arm drift outward away from the body. An example of how each prohibited motion can be detected by image analysis is described below, but the method is not limited to this.
 Prohibited motion (1): For example, the prohibited motion detection unit 15 may detect prohibited motion (1) by detecting that a line connecting the head and the midpoint of the shoulders, or a line connecting the midpoint of the shoulders and the center of the pelvis, tilts from the Y axis by a predetermined angle or more within the YZ plane.
 Prohibited motion (2): For example, the prohibited motion detection unit 15 may detect prohibited motion (2) by detecting that the line connecting the two shoulders tilts from the X axis by a predetermined angle or more within the XZ plane.
 Prohibited motion (3): For example, the prohibited motion detection unit 15 may detect prohibited motion (3) by detecting that the line connecting the shoulder and the elbow of the moving arm tilts from the Y axis by a predetermined angle or more within the XY plane.
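 All three checks reduce to measuring, within a given coordinate plane, the tilt of a line between two joints from a coordinate axis. A minimal sketch follows; the 10-degree threshold and the joint inputs are hypothetical assumptions, not values from the embodiment.

```python
import numpy as np

def tilt_from_axis_deg(p_from, p_to, axis, plane_axes):
    """Tilt in degrees of the line p_from -> p_to from `axis`, measured within
    the coordinate plane given by two axis indices (e.g., (1, 2) = YZ plane)."""
    v = (np.asarray(p_to, dtype=float) - np.asarray(p_from, dtype=float))[list(plane_axes)]
    a = np.asarray(axis, dtype=float)[list(plane_axes)]
    cos = np.dot(v, a) / (np.linalg.norm(v) * np.linalg.norm(a))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

THRESHOLD_DEG = 10.0  # hypothetical "predetermined angle"
Y_AXIS = (0.0, 1.0, 0.0)

def trunk_tilt_detected(shoulder_center, head):
    """Prohibited motion (1): the trunk line tilts from the Y axis within the YZ plane."""
    return tilt_from_axis_deg(shoulder_center, head, Y_AXIS, (1, 2)) > THRESHOLD_DEG
```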
"Shoulder external rotation, shoulder internal rotation"
 For shoulder external rotation and shoulder internal rotation, as shown in FIG. 38, the prohibited motions include (1) tilting the trunk forward or backward, (2) pulling the shoulder of the moving arm backward, (3) letting the moving arm drift outward away from the body, and (4) releasing the 90° bend of the elbow. An example of how each prohibited motion can be detected by image analysis is described below, but the method is not limited to this.
 Prohibited motion (1): For example, the prohibited motion detection unit 15 may detect prohibited motion (1) by detecting that a line connecting the head and the midpoint of the shoulders, or a line connecting the midpoint of the shoulders and the center of the pelvis, tilts from the Y axis by a predetermined angle or more within the YZ plane.
 Prohibited motion (2): For example, the prohibited motion detection unit 15 may detect prohibited motion (2) by detecting that the line connecting the two shoulders tilts from the X axis by a predetermined angle or more within the XZ plane.
 Prohibited motion (3): For example, the prohibited motion detection unit 15 may detect prohibited motion (3) by detecting that the line connecting the elbow and the hand of the moving arm tilts from the Y axis by a predetermined angle or more within the XY plane.
 Prohibited motion (4): For example, the prohibited motion detection unit 15 may detect prohibited motion (4) by detecting that the angle between the projection onto the XY plane of the line connecting the shoulder and the elbow of the moving arm and the projection onto the XY plane of the line connecting the elbow and the hand deviates from the range of 90° minus β to 90° plus β.
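 The 90° ± β check on the projected elbow angle can be sketched as follows; the tolerance β is a hypothetical setting.

```python
import numpy as np

def elbow_bend_released(shoulder, elbow, hand, beta_deg=10.0):
    """Prohibited motion (4): the elbow angle, measured between the shoulder-elbow
    and elbow-hand lines projected onto the XY plane, leaves 90 deg +/- beta.
    beta_deg is a hypothetical tolerance."""
    upper = np.asarray(elbow, dtype=float)[:2] - np.asarray(shoulder, dtype=float)[:2]
    fore = np.asarray(hand, dtype=float)[:2] - np.asarray(elbow, dtype=float)[:2]
    cos = np.dot(upper, fore) / (np.linalg.norm(upper) * np.linalg.norm(fore))
    angle = np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))
    return not (90.0 - beta_deg <= angle <= 90.0 + beta_deg)
```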
"Hip flexion"
 For hip flexion, as shown in FIG. 39, the prohibited motions include (1) tilting the trunk forward or backward. An example of how this prohibited motion can be detected by image analysis is described below, but the method is not limited to this.
 Prohibited motion (1): For example, the prohibited motion detection unit 15 may detect prohibited motion (1) by detecting that a line connecting the head and the midpoint of the shoulders, or a line connecting the midpoint of the shoulders and the midpoint of the hip joints, tilts from the Y axis by a predetermined angle or more within the YZ plane.
"Trunk rotation"
 For trunk rotation, as shown in FIG. 40, the prohibited motions include (1) letting one shoulder drop so that the line connecting the two shoulders is not horizontal. An example of how this prohibited motion can be detected by image analysis is described below, but the method is not limited to this.
 Prohibited motion (1): For example, the prohibited motion detection unit 15 may detect prohibited motion (1) by detecting that the line connecting the two shoulders tilts from the X axis by a predetermined angle or more within the XY plane.
"Lateral lower-limb reach"
 For the lateral lower-limb reach, as shown in FIG. 41, the prohibited motions include (1) the position of the center of the pelvis moving away from the toe on the support side by L/3 or more in the X-axis direction, where L is the X-axis distance between the two toes. An example of how this prohibited motion can be detected by image analysis is described below, but the method is not limited to this.
 Prohibited motion (1): For example, the prohibited motion detection unit 15 may detect prohibited motion (1) by calculating the distance L between the two toes and the X-axis distance M between the center of the pelvis and the toe on the support side, and detecting that M ≥ L/3 is satisfied.
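 This is a simple ratio test on X coordinates, sketched below under the assumption that the relevant X coordinates are already available.

```python
def pelvis_drift_detected(support_toe_x, other_toe_x, pelvis_center_x):
    """Prohibited motion (1): M >= L / 3, where L is the X-axis distance between
    the two toes and M the X-axis distance from the support-side toe to the
    center of the pelvis."""
    L = abs(other_toe_x - support_toe_x)
    M = abs(pelvis_center_x - support_toe_x)
    return M >= L / 3.0
```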
"Backward lower-limb reach"
 For the backward lower-limb reach, as shown in FIG. 42, the prohibited motions include (1) the position of the center of the pelvis moving away from the toe on the support side by L/3 or more in the X-axis direction, where L is the X-axis distance between the two toes. An example of how this prohibited motion can be detected by image analysis is described below, but the method is not limited to this.
 Prohibited motion (1): For example, the prohibited motion detection unit 15 may detect prohibited motion (1) by calculating the distance L between the two toes and the X-axis distance M between the center of the pelvis and the toe on the support side, and detecting that M ≥ L/3 is satisfied.
"T-shaped balance"
 For the T-shaped balance, the prohibited motions include (1) postures in which the shoulder is closer to the three-dimensional camera 200 than the elbow, or the knee is closer to the three-dimensional camera 200 than the hip joint. An example of how this prohibited motion can be detected by image analysis is described below, but the method is not limited to this.
 Prohibited motion (1): For example, the prohibited motion detection unit 15 may detect prohibited motion (1) by detecting that the line connecting the shoulder and the elbow tilts from the X axis by a predetermined angle or more within the XZ plane, or that the line connecting the knee and the hip joint tilts from the X axis by a predetermined angle or more within the XZ plane.
 When the prohibited motion detection unit 15 detects a prohibited motion, the output unit 14 may output a warning to that effect.
 For example, when the output unit 14 outputs, in real time, a video of the subject 100 while the subject 100 is performing the predetermined motion for measuring the range of motion, as shown in FIG. 34, the prohibited motion detection unit 15 may analyze the images in real time and detect prohibited motions. When the prohibited motion detection unit 15 detects a prohibited motion, the output unit 14 may display a warning to that effect in real time on the screen shown in FIG. 34. The output unit 14 may also output the warning via a speaker, a warning lamp, or the like.
 The calculation unit 12 may also use the detection results of the prohibited motion detection unit 15 when calculating the measurement value indicating the range of motion. As described in the second embodiment, the calculation unit 12 can analyze a moving image showing the subject 100 performing a predetermined series of motions, calculate the angle or distance for each frame, and take the maximum of the values calculated for the individual frames as the measurement value indicating the range of motion. In this case, the calculation unit 12 may take, as the measurement value, the maximum of the angles or distances calculated only from frames in which no prohibited motion was detected. That is, the calculation unit 12 may ignore the angles and distances calculated from frames in which a prohibited motion was detected, and calculate the measurement value indicating the range of motion from the angles and distances calculated from frames in which no prohibited motion was detected.
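 Combining measurement and prohibited motion detection then amounts to filtering frames before taking the maximum. A minimal sketch, with `value_fn` and `prohibited_fn` as hypothetical per-frame callbacks:

```python
def range_of_motion(frames, value_fn, prohibited_fn):
    """Maximum per-frame range-of-motion value, ignoring frames in which
    a prohibited motion was detected."""
    valid = [value_fn(f) for f in frames if not prohibited_fn(f)]
    return max(valid) if valid else None
```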
 The processing device 10 of the present embodiment described above achieves the same operations and effects as the first to fourth embodiments.
 Furthermore, the processing device 10 of the present embodiment can detect that the subject performed, while carrying out the predetermined motion for measuring the range of motion of the measurement target, a prohibited motion that affects the accuracy of the measurement. The processing device 10 of the present embodiment thus increases the reliability of the range-of-motion measurement results.
<Sixth embodiment>
 The processing device 10 of the present embodiment differs from the first to fifth embodiments in that it has a function of mitigating the problem that the measured range of motion becomes inflated when the subject performs the motion with momentum. This is described in detail below.
 An example of the hardware configuration of the processing device 10 is the same as in the first to fifth embodiments.
 An example of a functional block diagram of the processing device 10 is shown in FIG. 3, FIG. 27, FIG. 29, or FIG. 35. As illustrated, the processing device 10 includes a selection unit 11 and a calculation unit 12, and may further include at least one of an evaluation unit 13, an output unit 14, and a prohibited motion detection unit 15. The configurations of the selection unit 11, the evaluation unit 13, the output unit 14, and the prohibited motion detection unit 15 are the same as in the first to fifth embodiments.
 As described in the second embodiment, the calculation unit 12 analyzes a moving image showing the subject 100 performing a predetermined series of motions and calculates, for each frame, a value (angle or distance) indicating the range of motion of the measurement target. Then, as the value indicating the range of motion at a given instant, the calculation unit 12 calculates a statistic (e.g., mean, mode, or median) of the values calculated from the frame at that instant and each of the M frames immediately preceding it. In this way, for each of the instants contained in the moving image, the calculation unit 12 calculates a value indicating the range of motion based on the frame at that instant and the M frames immediately preceding it. The calculation unit 12 then takes the maximum of the values calculated for the individual instants as the measurement value indicating the range of motion of the measurement target.
 In the above processing, "the frame at that instant and the M frames immediately preceding it" may be replaced with "the frame at that instant and the M frames immediately following it" or "the frame at that instant and the M frames immediately preceding and following it".
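 A minimal sketch of the trailing-window variant is given below; the window length (M + 1) and the choice of the mean as the statistic are hypothetical settings, and the mode or median could be substituted.

```python
import statistics

def smoothed_range_of_motion(per_frame_values, window=5):
    """For each instant, take the mean over that frame and the (window - 1)
    preceding frames, then return the maximum of the smoothed values.
    window = M + 1 is a hypothetical setting."""
    smoothed = [
        statistics.mean(per_frame_values[i - window + 1 : i + 1])
        for i in range(window - 1, len(per_frame_values))
    ]
    return max(smoothed) if smoothed else None
```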
 The other configurations of the calculation unit 12 are the same as in the first to fifth embodiments.
 The processing device 10 of the present embodiment described above achieves the same operations and effects as the first to fifth embodiments.
 Furthermore, the processing device 10 of the present embodiment can mitigate the problem that the measured range of motion becomes inflated when the motion is performed with momentum. When the motion described in the second embodiment is performed with momentum, the instantaneously calculated value (angle or distance) can become large, but that state does not persist. By taking a statistic of the values calculated from a series of consecutive frames as the measurement value, such momentary spikes can be ignored or their influence reduced.
 Hereinafter, examples of reference embodiments are appended.
 1. A processing device comprising:
  selection means for selecting a measurement target of a range of motion; and
  calculation means for calculating a measurement value indicating the range of motion of the selected measurement target by analyzing an image of a subject with an analysis method corresponding to the selected measurement target.
 2. The processing device according to 1, wherein the calculation means identifies the subject on the image, identifies locations related to the selected measurement target on the image, and calculates the measurement value based on the identified locations.
 3. The processing device according to 2, wherein the calculation means calculates, as the measurement value, the angle between a line connecting an identified first location and second location and a reference line.
 4. The processing device according to 2 or 3, wherein the calculation means calculates, as the measurement value, the distance between an identified third location and fourth location.
 5. The processing device according to any one of 2 to 4, wherein the calculation means calculates, as the measurement value, the distance between a fifth location identified while the subject takes a first posture and the same fifth location identified while the subject takes a second posture different from the first posture.
 6. The processing device according to any one of 1 to 5, wherein the calculation means analyzes a moving image to calculate the measurement value, calculates a value indicating the range of motion of the selected measurement target based on each of a plurality of frame images, and calculates a statistic of the calculated plurality of values as the measurement value.
 7. The processing device according to any one of 1 to 6, further comprising evaluation means for evaluating the state of the measurement target based on the measurement value.
 8. The processing device according to 7, wherein, when two measurement values of the measurement target are calculated corresponding to the left and right sides of the body, the evaluation means evaluates the state of the measurement target based on the difference between the two measurement values.
 9. The processing device according to 7 or 8, wherein the evaluation means evaluates the state of the measurement target based on a result of comparing the measurement value with a reference value.
 10. The processing device according to any one of 7 to 9, wherein the evaluation means evaluates the state of the measurement target based on the measurement value of the subject and a past measurement value of the same subject.
 11. The processing device according to any one of 7 to 10, further comprising output means for outputting information indicating the state of the measurement target evaluated by the evaluation means.
 12. The processing device according to 11, wherein, when the state of the measurement target is a predetermined state, the output means outputs information for improving the state of the measurement target or information for obtaining that information.
 13. The processing device according to any one of 1 to 12, further comprising prohibited motion detection means for analyzing the image and detecting that the subject has performed a prohibited motion.
 14. The processing device according to 13, wherein the prohibited motion is registered in advance for each measurement target, and the prohibited motion detection means detects the prohibited motion related to the selected measurement target.
 15. The processing device according to 13 or 14, wherein the calculation means identifies, on the image, a plurality of locations corresponding to the selected measurement target and calculates the measurement value based on the identified locations; the prohibited motion detection means identifies, on the image, a plurality of locations corresponding to the selected measurement target and detects the prohibited motion based on the identified locations; and the plurality of locations identified by the calculation means for a first measurement target differs from the plurality of locations identified by the prohibited motion detection means for the first measurement target.
 16. A processing method executed by a computer, comprising:
  a selection step of selecting a measurement target of a range of motion; and
  a calculation step of calculating a measurement value indicating the range of motion of the selected measurement target by analyzing an image of a subject with an analysis method corresponding to the selected measurement target.
 17. A program causing a computer to function as:
  selection means for selecting a measurement target of a range of motion; and
  calculation means for calculating a measurement value indicating the range of motion of the selected measurement target by analyzing an image of a subject with an analysis method corresponding to the selected measurement target.
 This application claims priority based on Japanese Patent Application No. 2018-138529 filed on July 24, 2018, the entire disclosure of which is incorporated herein.

Claims (17)

  1.  可動域の測定対象を選択する選択手段と、
     被験者を撮影した画像を、選択された前記測定対象に対応した解析方法で解析することで、選択された前記測定対象の可動域を示す測定値を算出する算出手段と、
    を有する処理装置。
    Selecting means for selecting a measurement object of the range of motion,
    A calculation unit that calculates a measurement value indicating a movable range of the selected measurement target by analyzing an image of the subject, by an analysis method corresponding to the selected measurement target,
    A processing device having:
  2.  請求項1に記載の処理装置において、
     前記算出手段は、
      前記被験者を前記画像上で特定し、
      選択された前記測定対象に関連した箇所を前記画像上で特定し、
      特定した前記箇所に基づき前記測定値を算出する処理装置。
    The processing device according to claim 1,
    The calculating means,
    Identifying the subject on the image;
    Identifying a location related to the selected measurement target on the image,
    A processing device that calculates the measurement value based on the specified location.
  3.  請求項2に記載の処理装置において、
     前記算出手段は、
      特定した第1の箇所と第2の箇所とを結ぶ線と、基準線とのなす角を、前記測定値として算出する処理装置。
    The processing device according to claim 2,
    The calculating means,
    A processing device for calculating, as the measured value, an angle between a line connecting the identified first location and the second location and a reference line.
  4.  請求項2又は3に記載の処理装置において、
     前記算出手段は、
      特定した第3の箇所と第4の箇所との距離を、前記測定値として算出する処理装置。
    The processing apparatus according to claim 2 or 3,
    The calculating means,
    A processing device that calculates a distance between the specified third location and the fourth location as the measured value.
  5.  請求項2から4のいずれか1項に記載の処理装置において、
     前記算出手段は、
      前記被験者が第1の姿勢をとっている時に特定した第5の箇所と、前記被験者が前記第1の姿勢と異なる第2の姿勢をとっている時に特定した前記第5の箇所との距離を、前記測定値として算出する処理装置。
    The processing apparatus according to any one of claims 2 to 4,
    The calculating means,
    The distance between the fifth position specified when the subject is in the first position and the fifth position specified when the subject is in the second position different from the first position is , A processing device for calculating as the measured value.
  6.  請求項1から5のいずれか1項に記載の処理装置において、
     前記算出手段は、
      動画像を解析して前記測定値を算出し、
      選択された前記測定対象の可動域を示す値を複数のフレーム画像各々に基づき算出し、算出した複数の前記値の統計値を前記測定値として算出する処理装置。
    The processing device according to any one of claims 1 to 5,
    The calculating means,
    Analyzing the moving image to calculate the measurement value,
    A processing device that calculates a value indicating the selected movable range of the measurement target based on each of a plurality of frame images, and calculates a statistical value of the calculated plurality of values as the measurement value.
  7.  請求項1から6のいずれか1項に記載の処理装置において、
     前記測定値に基づき、前記測定対象の状態を評価する評価手段をさらに有する処理装置。
    The processing apparatus according to any one of claims 1 to 6,
    A processing apparatus further comprising an evaluation unit that evaluates a state of the measurement target based on the measurement value.
  8.  請求項7に記載の処理装置において、
     前記測定対象の前記測定値が、身体の左右に対応して2パターン算出された場合、
     前記評価手段は、2パターンの前記測定値の差に基づき、前記測定対象の状態を評価する処理装置。
    The processing device according to claim 7,
    When the measurement value of the measurement target is calculated in two patterns corresponding to the left and right of the body,
    A processing device that evaluates a state of the measurement target based on a difference between the measured values of the two patterns.
  9.  請求項7又は8に記載の処理装置において、
     前記評価手段は、前記測定値と参照値との比較結果に基づき、前記測定対象の状態を評価する処理装置。
    The processing device according to claim 7 or 8,
    A processing device that evaluates a state of the measurement target based on a comparison result between the measurement value and a reference value.
  10.  請求項7から9のいずれか1項に記載の処理装置において、
     前記評価手段は、同じ前記被験者の前記測定値と過去の前記測定値に基づき、前記測定対象の状態を評価する処理装置。
    The processing device according to any one of claims 7 to 9,
    A processing device that evaluates a state of the measurement target based on the measurement value of the same subject and the past measurement value.
  11.  請求項7から10のいずれか1項に記載の処理装置において、
     前記評価手段が評価した前記測定対象の状態を示す情報を出力する出力手段をさらに有する処理装置。
    The processing device according to any one of claims 7 to 10,
    A processing device further comprising an output unit that outputs information indicating a state of the measurement target evaluated by the evaluation unit.
  12.  請求項11に記載の処理装置において、
     前記出力手段は、前記測定対象の状態が所定の状態である場合、前記測定対象の状態を改善するための情報又は前記情報を取得するための情報を出力する処理装置。
    The processing device according to claim 11,
    A processing device that outputs information for improving the state of the measurement target or information for acquiring the information, when the state of the measurement target is a predetermined state.
  13.  請求項1から12のいずれか1項に記載の処理装置において、
     前記画像を解析し、前記被験者が禁止動作を行ったことを検出する禁止動作検出手段をさらに有する処理装置。
    The processing device according to any one of claims 1 to 12,
    A processing device further comprising a prohibited operation detecting unit that analyzes the image and detects that the subject has performed a prohibited operation.
  14.  The processing device according to claim 13, wherein the prohibited motion is registered in advance for each measurement target, and the prohibited-motion detection means detects the prohibited motion associated with the selected measurement target.
  15.  The processing device according to claim 13 or 14, wherein the calculation means specifies a plurality of locations corresponding to the selected measurement target on the image and calculates the measurement value based on the specified locations, the prohibited-motion detection means specifies a plurality of locations corresponding to the selected measurement target on the image and detects the prohibited motion based on the specified locations, and the plurality of locations specified by the calculation means for a first measurement target differs from the plurality of locations specified by the prohibited-motion detection means for the first measurement target.
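The separation required by claim 15, distinct location sets for measurement and for prohibited-motion detection, can be pictured as two registries keyed by the selected measurement target. The target name and body-part labels below are hypothetical placeholders, not the claimed configuration.

```python
# Hypothetical registries: the locations the calculation means tracks and
# the (different) locations the prohibited-motion detection means tracks,
# both keyed by the selected measurement target.
MEASUREMENT_POINTS = {
    "shoulder_flexion": ("shoulder", "elbow", "wrist"),
}
PROHIBITED_MOTION_POINTS = {
    # Leaning the trunk back would inflate the apparent shoulder flexion,
    # so trunk keypoints are monitored instead of the arm keypoints.
    "shoulder_flexion": ("neck", "hip"),
}

def points_for(target):
    measure = MEASUREMENT_POINTS[target]
    prohibit = PROHIBITED_MOTION_POINTS[target]
    assert set(measure) != set(prohibit)  # the claim requires them to differ
    return measure, prohibit

print(points_for("shoulder_flexion"))
```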
  16.  A processing method in which a computer executes: a selection step of selecting a measurement target of a range of motion; and a calculation step of calculating a measurement value indicating the movable range of the selected measurement target by analyzing an image of the subject using an analysis method corresponding to the selected measurement target.
  17.  A program causing a computer to function as: selection means for selecting a measurement target of a range of motion; and calculation means for calculating a measurement value indicating the movable range of the selected measurement target by analyzing an image of the subject using an analysis method corresponding to the selected measurement target.
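Taken together, the method and program claims amount to a dispatch from the selected measurement target to a target-specific analysis. The sketch below illustrates that dispatch only; the target names and the stand-in analysis functions are assumptions, not the claimed implementation.

```python
from typing import Callable, Dict, List

def analyze_shoulder_flexion(frames: List) -> float:
    # Stand-in for the real image analysis of this measurement target.
    return 64.0

def analyze_knee_flexion(frames: List) -> float:
    # Stand-in for the real image analysis of this measurement target.
    return 121.5

# One analysis method per selectable measurement target.
ANALYSES: Dict[str, Callable[[List], float]] = {
    "shoulder_flexion": analyze_shoulder_flexion,
    "knee_flexion": analyze_knee_flexion,
}

def measure(target: str, frames: List) -> float:
    """Selection step followed by the calculation step."""
    if target not in ANALYSES:
        raise ValueError(f"unknown measurement target: {target}")
    return ANALYSES[target](frames)

print(measure("knee_flexion", frames=[]))  # -> 121.5
```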
PCT/JP2019/022386 2018-07-24 2019-06-05 Processing device, processing method and program WO2020021873A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2020532197A JPWO2020021873A1 (en) 2018-07-24 2019-06-05 Processing equipment, processing methods and programs

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018138529 2018-07-24
JP2018-138529 2018-07-24

Publications (1)

Publication Number Publication Date
WO2020021873A1 (en)

Family

ID=69180480

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/022386 WO2020021873A1 (en) 2018-07-24 2019-06-05 Processing device, processing method and program

Country Status (2)

Country Link
JP (1) JPWO2020021873A1 (en)
WO (1) WO2020021873A1 (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011078728A (en) * 2009-03-10 2011-04-21 Shinsedai Kk Body state evaluation unit, state estimation unit, step estimation unit, and health management system
JP2012139480A (en) * 2010-12-15 2012-07-26 Shinsedai Kk Physical condition evaluation device, physical condition evaluation method, and computer program
JP6369811B2 (en) * 2014-11-27 2018-08-08 パナソニックIpマネジメント株式会社 Gait analysis system and gait analysis program

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02139611U (en) * 1989-04-24 1990-11-21
JP2015061579A * 2013-07-01 2015-04-02 Toshiba Corporation Motion information processing apparatus
WO2016208291A1 * 2015-06-26 2016-12-29 NEC Solution Innovators, Ltd. Measurement device and measurement method
WO2018087853A1 * 2016-11-09 2018-05-17 System Friend Inc. Stereoscopic image generation system, stereoscopic image generation method, and stereoscopic image generation program

Also Published As

Publication number Publication date
JPWO2020021873A1 (en) 2021-08-12

Legal Events

Code Title Description
121 (EP): The EPO has been informed by WIPO that EP was designated in this application. Ref document number: 19842077; country: EP; kind code: A1.
ENP: Entry into the national phase. Ref document number: 2020532197; country: JP; kind code: A.
NENP: Non-entry into the national phase. Ref country code: DE.
122 (EP): PCT application non-entry in European phase. Ref document number: 19842077; country: EP; kind code: A1.