5 results for "Oguz Akkas"
Search Results
2. Driver Movement Patterns Indicate Distraction and Engagement
- Author
- John D. Lee, Oguz Akkas, and Robert G. Radwin
- Subjects
Adult, Automobile Driving, Psychometrics, Human Factors and Ergonomics, Motor Activity, Occupational Safety and Health, Behavioral Neuroscience, Physical Medicine and Rehabilitation, Distraction, Injury Prevention, Humans, Attention, Applied Psychology, Biomechanical Phenomena, Psychology, Psychomotor Performance
- Abstract
Objective: This research considers how driver movements in video clips of naturalistic driving relate to observers' subjective ratings of distraction and engagement behaviors. Background: Naturalistic driving video provides a unique window into driver behavior unmatched by crash data, roadside observations, or driving simulator experiments. However, manually coding many thousands of hours of video is impractical. An objective method is needed to identify driver behaviors suggestive of distracted or disengaged driving so that automated computer vision analysis can access this rich source of data. Method: Visual analog scales ranging from 0 to 10 were created, and observers rated their perception of driver distraction and engagement behaviors in selected naturalistic driving videos. Driver kinematics time series were extracted from frame-by-frame coding of driver motions, including head rotation, head flexion/extension, and hands on/off the steering wheel. Results: The ratings were consistent among participants. A statistical model predicting average ratings from the kinematic features accounted for 54% of distraction rating variance and 50% of engagement rating variance. Conclusion: Rated distraction behavior was positively related to the magnitude of head rotation and the fraction of time the hands were off the wheel. Rated engagement behavior was positively related to the variation of head rotation and negatively related to the fraction of time the hands were off the wheel. Application: If automated computer vision can code simple kinematic features, such as driver head and hand movements, then large volumes of naturalistic driving video could be automatically analyzed to identify instances when drivers were distracted or disengaged.
- Published
- 2017
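The statistical model described in the abstract above is a regression of observer ratings on driver kinematic features. As an illustration only (the paper's actual feature set and coefficients are not given here), an ordinary least squares fit with an R² readout might be sketched as:

```python
# Illustrative sketch only: a linear model of the kind the abstract
# describes, predicting an observer rating (0-10 visual analog scale)
# from driver kinematic features. Feature choices here are
# hypothetical, not the paper's.
import numpy as np

def fit_rating_model(X, y):
    """Ordinary least squares fit of ratings on kinematic features.

    X: (n, k) feature matrix, e.g. columns = [mean |head rotation|,
       SD of head rotation, fraction of time hands off the wheel].
    y: (n,) average observer ratings.
    Returns (coefficients with intercept first, R squared).
    """
    A = np.column_stack([np.ones(len(X)), X])     # add intercept column
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)  # least squares fit
    y_hat = A @ beta
    ss_res = float(np.sum((y - y_hat) ** 2))
    ss_tot = float(np.sum((y - y.mean()) ** 2))
    return beta, 1.0 - ss_res / ss_tot
```

With features like these, the signs of `beta[1:]` would indicate the direction of each relationship, e.g. a positive coefficient on hands-off-wheel time in a distraction model, matching the abstract's conclusion.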
3. Measuring exertion time, duty cycle and hand activity level for industrial tasks using computer vision
- Author
- Oguz Akkas, David Rempel, Yu Hen Hu, Carisa Harris Adamson, Cheng Hsien Lee, and Robert G. Radwin
- Subjects
Engineering, Feature Vector, Physical Exertion, Decision Tree, Video Recording, Physical Therapy, Sports Therapy and Rehabilitation, Human Factors and Ergonomics, Repetitive Motion, Computer Vision Algorithms, Humans, Computer Vision, Exertion, Sensitivity Analysis, Simulation, Computers, Hand, Duty Cycle, Time and Motion Studies, Artificial Intelligence, Algorithms
- Abstract
Two computer vision algorithms were developed to automatically estimate exertion time, duty cycle (DC) and hand activity level (HAL) from videos of workers performing 50 industrial tasks. The average DC difference between manual frame-by-frame analysis and the computer vision DC was −5.8% for the Decision Tree (DT) algorithm and 1.4% for the Feature Vector Training (FVT) algorithm. The average HAL difference was 0.5 for the DT algorithm and 0.3 for the FVT algorithm. A sensitivity analysis, conducted to examine the influence that deviations in DC have on HAL, found that HAL remained unaffected when the DC error was less than 5%; a DC error of less than 10% affects HAL by less than 0.5, which is negligible. Automatic computer vision HAL estimates were therefore comparable to manual frame-by-frame estimates. Practitioner Summary: Computer vision was used to automatically estimate exertion time, duty cycle and hand activity level from videos of workers performing industrial tasks.
- Published
- 2017
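Duty cycle as used above is the fraction of the work cycle spent in exertion. A minimal sketch of that computation, assuming a per-frame 0/1 exertion indicator (as frame-by-frame analysis or a vision classifier would supply) and a hypothetical frame rate; this is not the paper's algorithm:

```python
# Minimal sketch, not the paper's algorithm: compute exertion time and
# duty cycle from a per-frame exertion indicator. The frame rate fps
# is an assumed parameter.
def duty_cycle(exerting, fps=30.0):
    """exerting: sequence of 0/1 flags, one per video frame.

    Returns (exertion_time_s, total_time_s, duty_cycle_percent).
    """
    n = len(exerting)
    exertion_frames = sum(exerting)
    return (exertion_frames / fps,          # time spent exerting
            n / fps,                        # total observed time
            100.0 * exertion_frames / n)    # duty cycle in percent
```

For example, `duty_cycle([1, 1, 0, 0], fps=2.0)` returns `(1.0, 2.0, 50.0)`.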
4. Measuring Elemental Time and Duty Cycle Using Automated Video Processing
- Author
- Yu Hen Hu, Cheng-Hsien Lee, Oguz Akkas, Robert G. Radwin, and Thomas Y. Yen
- Subjects
Male, Computer Science, Feature Vector, Movement, Acceleration, Decision Tree, Video Recording, Physical Therapy, Sports Therapy and Rehabilitation, Human Factors and Ergonomics, Kinematics, Task Performance and Analysis, Image Processing, Computer-Assisted, Humans, Computer Vision, Ground Truth, Repetitive Task, Video Processing, Hand, Biomechanical Phenomena, Duty Cycle, Video Tracking, Muscle Fatigue, Female, Artificial Intelligence, Algorithms
- Abstract
A marker-less 2D video algorithm measured hand kinematics (location, velocity and acceleration) in a paced repetitive laboratory task for varying hand activity levels (HAL). The decision tree (DT) algorithm identified the trajectory of the hand using spatiotemporal relationships during the exertion and rest states. The feature vector training (FVT) method utilised a k-nearest neighbourhood classifier, trained using a set of samples or the first cycle. The average duty cycle (DC) error using the DT algorithm was 2.7%. The FVT algorithm had an average 3.3% error when trained using the first-cycle sample of each repetitive task, and a 2.8% average error when trained using several representative repetitive cycles. The error for HAL was 0.1 for both algorithms, which was considered negligible. Elemental times, stratified by task and subject, were not statistically different from ground truth (p
- Published
- 2016
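The feature vector training idea above amounts to nearest-neighbour classification of video frames into exertion and rest states. A toy sketch, with illustrative feature vectors standing in for the paper's hand kinematics:

```python
# Toy sketch of nearest-neighbour (k = 1) frame classification, in the
# spirit of the FVT method described above. The feature vectors here
# stand in for hand kinematics (e.g. location, velocity, acceleration)
# and are purely illustrative.
import math

def nn_classify(frame, training):
    """training: list of (feature_vector, label) pairs, e.g. manually
    labelled frames from the first cycle. Returns the label of the
    nearest training sample by Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(training, key=lambda sample: dist(frame, sample[0]))[1]
```

Training on the first cycle versus several representative cycles, as compared in the abstract, corresponds to how the `training` list is populated.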
5. A hand speed-duty cycle equation for estimating the ACGIH hand activity level rating
- Author
- Oguz Akkas, Sheryl S. Ulin, Robert G. Radwin, David P. Azari, Thomas J. Armstrong, David Rempel, Chia Hsiung Eric Chen, and Yu Hen Hu
- Subjects
Engineering, Work, Threshold Limit Values, Movement, Monte Carlo Method, Physical Exertion, Physical Therapy, Sports Therapy and Rehabilitation, Human Factors and Ergonomics, Root Mean Square, Statistics, Linear Regression, Task Performance and Analysis, Humans, Simulation, Occupational Health, Anthropometry, Regression Analysis, Hand, United States, Biomechanical Phenomena, Military Personnel, Duty Cycle
- Abstract
An equation was developed for estimating hand activity level (HAL) directly from tracked root mean square (RMS) hand speed (S) and duty cycle (D). Table lookup, an equation or marker-less video tracking can estimate HAL from motion/exertion frequency (F) and D. Since automatically estimating F is sometimes complex, HAL may be more readily assessed using S. Hands from 33 videos originally used for the HAL rating were tracked to estimate S, scaled relative to hand breadth (HB), and single-frame analysis was used to measure D. Since HBs were unknown, a Monte Carlo method was employed to iteratively estimate the regression coefficients from US Army anthropometry survey data. The equation HAL = 10[e^(−15.87 + 0.02D + 2.25 ln S) / (1 + e^(−15.87 + 0.02D + 2.25 ln S))], R² = 0.97, had a residual range of ±0.5 HAL. The S equation fit the Latko et al. (1997) data better and predicted independently observed HAL values (Harris 2011) better (MSE = 0.16) than the F equation (MSE = 1.28).
- Published
- 2014
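The reported equation can be transcribed directly as code. The coefficients below are those stated in the abstract; S is RMS hand speed scaled by hand breadth, and D is the duty cycle in percent:

```python
# The hand speed-duty cycle HAL equation as reported in the abstract:
# HAL = 10 * e^z / (1 + e^z), with z = -15.87 + 0.02*D + 2.25*ln(S).
import math

def hal_from_speed(S, D):
    """S: RMS hand speed (hand breadths per second), S > 0.
    D: duty cycle in percent (0-100). Returns HAL on the 0-10 scale."""
    z = -15.87 + 0.02 * D + 2.25 * math.log(S)
    return 10.0 * math.exp(z) / (1.0 + math.exp(z))
```

The logistic form keeps HAL within (0, 10), and faster hand speeds or higher duty cycles increase the rating.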
Discovery Service for Jio Institute Digital Library