Hello! I'm Hardik Gossain, a passionate Robotics and Control Systems Engineer with a Master’s degree from the University of Waterloo and two years of hands-on experience in the field. I specialize in designing robotic systems, developing advanced control algorithms, and conducting innovative research. My expertise spans AI, machine learning, sensor integration, and project management, which I leverage to build practical solutions to real-world challenges. I am always excited to collaborate with fellow professionals and enthusiasts to exchange ideas and push the boundaries of what robotics can achieve. Thank you for visiting my website – I look forward to connecting!
Hardik Gossain
8 Roosevelt Ave
Waterloo, ON N2L 2N1, Canada
+1 (647) 865-4089
hardikgossain9@gmail.com
Responsible for testing advanced film, television, and telecommunications equipment, leveraging automated test equipment (ATE). Designed and implemented test scripts in Python to streamline testing processes and improve efficiency. Tasks include diagnosing and troubleshooting electronic components, analyzing performance and reliability, and building and updating test jigs. Collaborated with cross-functional teams to ensure timely delivery of high-quality products while adhering to documented procedures such as ECOs and working within the AS/400 system.
Designed and implemented AWS infrastructure (Lambda, S3, API Gateway) to streamline IoT data recording and analysis, completing the project within two months for a startup. Additionally, I developed multiple path planning algorithms for robots navigating a grid-based map from source to destination. Collaborating with a PhD researcher, we delivered a robust solution that met all project specifications over a three-month period.
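As an illustrative sketch only (not the project code), shortest-path planning on a grid-based map can be done with breadth-first search; the occupancy grid, start, and goal below are hypothetical examples:

```python
from collections import deque

def bfs_path(grid, start, goal):
    """Shortest path on a 4-connected grid; 0 = free cell, 1 = obstacle."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}          # also serves as the visited set
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == goal:
            # Walk the parent pointers back to the start
            path, node = [], goal
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in prev):
                prev[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None  # goal unreachable

# Example: detour around a wall in the middle row
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = bfs_path(grid, (0, 0), (2, 0))
```

On a uniform-cost grid, BFS already returns an optimal path; weighted maps call for Dijkstra or A* instead.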
In my role, I calibrated a range of machines, including manipulators, to uphold precision standards in manufacturing parts for HVAC systems. I conducted comprehensive quality checks, encompassing both physical and electrical assessments, to maintain product integrity. I coordinated closely with team members to meet production targets on time, and throughout the manufacturing process I consistently upheld exceptional quality standards, contributing to the overall success of the operation.
I contributed to the development of a pollution monitoring product launched in Delhi. I applied FreeRTOS on the STM32 to collect data from accelerometers, gyroscopes, and PM2.5 sensors and designed the PCB using Eagle. I used I2C, CAN, SPI, and UART protocols for sensor communication with the STM32 and Raspberry Pi. Before deployment, I performed unit, integration, and hardware testing to ensure the device's reliability.
In addition to the skills mentioned above, I am proficient with tools such as AWS, Google Colab, Atmel Studio, and Eagle, as well as STM32CubeIDE and Google Firebase, which I use to craft robust software solutions. My expertise extends to IoT, WSN, STM32, ROS, SLAM, and communication protocols including I2C, CAN, UART, Ethernet, and TCP/IP, and I leverage real-time operating systems to build responsive embedded systems. This toolset enables me to develop dynamic applications, manage cloud infrastructure, design intricate hardware, and navigate complex networking environments.
The project utilized image processing for hand movement tracking to steer the aircraft in the simulator, while voice recognition controlled general functions such as adjusting air brakes and accelerating on the runway.
This project utilized key concepts from multivariable control systems and linear algebra to develop a MATLAB model of an under-actuated robot. Stabilization tests were conducted to compare the performance of the LQR controller against PID control.
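As a minimal sketch of the LQR idea (the project itself used MATLAB and the robot's own dynamics), the discrete-time LQR gain can be computed by iterating the Riccati difference equation; the double-integrator plant below is a hypothetical stand-in, not the under-actuated robot's model:

```python
import numpy as np

def dlqr(A, B, Q, R, iters=500):
    """Discrete-time LQR gain K by iterating the Riccati difference equation."""
    P = Q.copy()
    for _ in range(iters):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K

# Hypothetical double-integrator plant, discretized with dt = 0.1 s
dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.5 * dt**2], [dt]])
K = dlqr(A, B, Q=np.eye(2), R=np.array([[1.0]]))

# For a stabilizing K, the closed-loop eigenvalues lie inside the unit circle
eigs = np.linalg.eigvals(A - B @ K)
```

A stabilization test like the project's then amounts to checking that `A - B K` is Schur stable and comparing settling behaviour against a tuned PID loop.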
The KUKA youBot was modeled in MATLAB and Simulink using Denavit–Hartenberg parameters. This enabled the application of forward and inverse kinematics, force dynamics, path planning, and optimization to perform pick-and-place operations. Additionally, image processing was used to detect the color of objects, allowing the robot to differentiate and handle them accordingly.
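To illustrate the Denavit–Hartenberg approach (a sketch only; the project used MATLAB/Simulink and the youBot's actual DH table, which is not reproduced here), forward kinematics chains one homogeneous transform per link:

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Homogeneous transform for one standard Denavit-Hartenberg link."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(dh_rows):
    """Chain per-link transforms: base frame to end-effector frame."""
    T = np.eye(4)
    for row in dh_rows:
        T = T @ dh_transform(*row)
    return T

# Illustrative planar 2-link arm, rows are (theta, d, a, alpha);
# NOT the youBot's DH table
T = forward_kinematics([( np.pi / 2, 0.0, 1.0, 0.0),
                        (-np.pi / 2, 0.0, 1.0, 0.0)])
```

For this pose the end effector sits at (1, 1, 0) with the base orientation, which is easy to confirm by sketching the two unit links.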
This research project combined the use of drones and manipulators. The three-degree-of-freedom robotic arm operated in two modes: an autonomous mode, where a camera detected objects for the end-effector to pick up, and a manual mode, where it followed the movements of an exoskeleton glove worn by the user. The drone featured trajectory following and altitude control for precise navigation.
This project involved developing an autonomous robot to simulate the plantation process in agriculture using image processing. The robot, powered by OpenCV on a Raspberry Pi, followed a line and detected colored markers along its path. These markers were then translated into a computer-generated image, simulating the planting of various flowers corresponding to the color and number of markers detected.
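One simple way to turn a detected marker into a flower label (a hedged sketch, not the project's OpenCV pipeline; the reference colours below are made-up calibration values) is nearest-neighbour matching of the marker's average colour:

```python
import numpy as np

# Hypothetical reference colours (R, G, B) for the markers on the line
REFERENCE = {
    "red":    (200,  40,  40),
    "blue":   ( 40,  40, 200),
    "yellow": (220, 210,  50),
}

def classify_marker(mean_rgb):
    """Assign a marker's average colour to the nearest reference colour."""
    pixel = np.asarray(mean_rgb, dtype=float)
    return min(
        REFERENCE,
        key=lambda name: np.linalg.norm(pixel - np.array(REFERENCE[name], float)),
    )
```

In practice the project's OpenCV approach would segment the marker first (e.g. an HSV threshold) and then classify the segmented region; HSV is usually more robust to lighting than raw RGB distances.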
The two-wheeled mobile robot used infrared sensors to detect black lines on a white surface and included ultrasonic and color sensors for obstacle and object color detection. The robot's navigation area was stored in a matrix, and the A* algorithm was used for pathfinding. It located objects, avoided obstacles, and transported them to designated drop areas based on color. The programming was done in embedded C using Atmel Studio.
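The project's firmware was embedded C, but the A* pathfinding it used can be sketched compactly in Python; the matrix below stands in for the stored navigation area, with a Manhattan-distance heuristic on a 4-connected grid:

```python
import heapq

def a_star(grid, start, goal):
    """A* on a 4-connected occupancy grid (0 = free, 1 = obstacle)."""
    def h(node):  # Manhattan-distance heuristic, admissible on this grid
        return abs(node[0] - goal[0]) + abs(node[1] - goal[1])

    rows, cols = len(grid), len(grid[0])
    open_set = [(h(start), 0, start)]       # (f = g + h, g, node)
    came_from = {start: None}
    g_cost = {start: 0}
    closed = set()
    while open_set:
        _, g, node = heapq.heappop(open_set)
        if node == goal:
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        if node in closed:
            continue
        closed.add(node)
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    came_from[(nr, nc)] = node
                    heapq.heappush(open_set, (ng + h((nr, nc)), ng, (nr, nc)))
    return None  # no path exists

# Example map: obstacle wall forces a detour
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = a_star(grid, (0, 0), (2, 0))
```

With an admissible heuristic like Manhattan distance, A* returns an optimal path while typically expanding far fewer cells than an uninformed search.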
This project applies machine learning to Human-Robot Interaction (HRI), focusing on facial expression and hand gesture recognition. Two convolutional neural networks were trained: one to recognize facial expressions (happy, sad, neutral, surprised, angry) using the FER-2013 dataset, and another for hand gestures (stop, okay, thumbs up, peace). The facial expression model had five convolutional layers, while the hand gesture model had three. Live video was then fed into the two networks to identify emotions and hand gestures.
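As a toy illustration of the building block both networks share (a plain-NumPy sketch, not the trained models or their framework code), here is a single convolutional layer's forward pass with valid padding, one filter, and a ReLU activation:

```python
import numpy as np

def conv2d_relu(image, kernel):
    """Valid-padding 2-D convolution (cross-correlation, as CNNs use) + ReLU."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return np.maximum(out, 0.0)  # ReLU activation

# Toy 5x5 image with a vertical edge, and a Sobel-like column filter
image = np.zeros((5, 5))
image[:, 2:] = 1.0
kernel = np.array([[-1.0, 0.0, 1.0]] * 3)
feature_map = conv2d_relu(image, kernel)
```

A trained CNN stacks many such filters (five conv layers in the expression model, three in the gesture model), learning the kernel weights from data rather than hand-coding them as above.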
The project consists of a sensor node that records land movement and soil moisture. This data is used to detect and predict the possibility of landslides in hilly areas. The node transfers data to a server over GPRS, where it can be viewed graphically to monitor for potential landslides.