LDR   05262ngm^^22003853a^4500
001        AA00000361_00001
005        20220607160558.0
006        m^^^^^o^^i^^^^^^^^
007        cr^^na---ma^mp
008        220607n^^^^^^^^xx^nnn^^^^^^^^o^^^^ueng^d
245 00 |a Electrical and Computer Engineering |h [electronic resource].
260        |c 04/01/2022.
520 3    |a Presenter: Jesse Smithers, Christian Mackey, Anthony Ficarrotta. Title: Semi-Autonomous Robot that Detects Hazardous Gases. Abstract: Gas leaks have become an increasing concern, as high-risk conditions can lead to complications affecting a wide range of industries. Robotics has been increasingly useful in higher-risk situations, specifically in fully and semi-autonomous designs. By utilizing programming and microcontrollers, a semi-autonomous gas-detecting robot can be developed to create a safe work environment for all involved. This product would be capable of searching for and determining the presence of particular gases in a given area and informing the remote user of the saturation level. Being semi-autonomous, the robot would search an area and report any findings through a wireless interface, displaying current and prior gas readings to eventually identify a “hotspot.” The display can also provide live footage when the device is manually operated. If the robot is autonomously searching and an obstacle is sensed in its path, it will adjust its path accordingly. Our design is comparable to that of the SMP Robotics S6 robot and the RoboGasInspector. With a Raspberry Pi as the main processing unit, the movement and data display can be programmed using Python. The interface will display the data from the ultrasonic sensors, MQ2, MH-Z19, and PM2.5. This project represents a culmination of the coursework within SUNY Oswego’s Electrical and Computer Engineering program, as well as effort outside of class, such as extensive literature reviews. If successful and implemented, this device would increase the overall safety of examining potential gas leaks.
520 3    |a Presenter: Donald Berry, Taylor Wiltshire. Title: Head Gesture Controller. Abstract: The aim of this project is to give the performer increased capabilities during performances by allowing direct control of ongoing effects throughout the show. The device will function through the use of head gestures, which will be determined by a machine learning algorithm loaded onto a laptop. These gestures will then be mapped to different outputs to allow for actions such as volume/reverb control, which will be customizable based on the individual user’s needs. This flow of ideas is depicted further in Figure 1. The device will be mounted to the head of the performer to allow a full range of movement, ensuring the device only aids the performer rather than hindering them. The model will be trained with data gathered by performing successive head movements while the device is mounted to the head. This stored data will then be split into training and testing sets so that the model can accurately determine each head movement. With each head gesture determined, the headband can accurately control augmentations to instruments such as vibrato and reverb.
520 3    |a Presenter: Abisola Akinfenwa, Kyle Lofgren. Title: UV Light Sanitizing Robot. Abstract: The proposed project is an autonomous, ultraviolet (UV) light sanitizing robot made for small areas, such as a desk. The idea comes from the COVID-19 pandemic, where an autonomous sanitizing robot could be beneficial in a space to reduce exposure to the disease and keep people safe from other bacteria. Moreover, the robot can eliminate a large amount of waste while reducing a good amount of pollution at the same time. The robot’s autonomy comes from its mapping feature, which combines microprocessor commands with obstacle and ledge sensors (ultrasonic and bump sensors). For this robot, we built a chassis that is tall rather than wide, keeping the robot’s footprint small enough to cover a good amount of surface area. The small size helps ensure the robot is more accurate when navigating around the table. We plan to map out the robot’s direction with the use of our Raspberry Pi and sensor functions. This project can be used in multiple kinds of spaces, from classrooms to office spaces, and change how we see cleaning for the better.
520 3    |a Session chair: Sungeun Kim
533        |a Electronic reproduction. |c SUNY Oswego Institutional Repository, |d 2022. |f (Oswego Digital Library) |n Mode of access: World Wide Web. |n System requirements: Internet connectivity; Web browser software.
535 1    |a SUNY Oswego.
541        |a Collected for SUNY Oswego Institutional Repository by the online self-submittal tool. Submitted by Zach Vickery.
650        |a Quest 2022.
650        |a Electrical and Computer Engineering.
720        |a SUNY Oswego.
720 1    |a Smithers, Jesse. |4 spk
720 1    |a Mackey, Christian. |4 spk
720 1    |a Ficarrotta, Anthony. |4 spk
720 1    |a Berry, Donald. |4 spk
720 1    |a Wiltshire, Taylor. |4 spk
720 1    |a Akinfenwa, Abisola. |4 spk
720 1    |a Lofgren, Kyle. |4 spk
720 1    |a Kim, Sungeun. |4 spk
830    0 |a Oswego Digital Library.
830    0 |a Quest.
852        |a OswegoDL |c Quest
856 40 |u https://digitallibrary.oswego.edu/AA00000361/00001 |y Electronic Resource
997        |a Quest


The record above was auto-generated from the METS file.