Week 1: Setting responsibilities, goals and making a start on our project


  • About The Project 

Our project aims to develop an additional tool to help combat the growing issue of antibiotic use within the farming industry, especially in the rearing of cows and pigs. You may find the following article by PETA an interesting read: Read Here

Unfortunately, animals can be very susceptible to infections for various reasons, but one in particular relates to the conditions in which they are housed. This concern arises particularly when they are not grazing, which is where our focus will be placed. We are looking to develop a product that can measure the temperature and humidity of animal bedding to help farmers manage bedding cleaning practices. High humidity and temperature are prime conditions for bacterial and fungal growth, so if we can detect when these conditions arise, we can help indicate when bedding needs changing and prompt a health check for the animals that have been enclosed within a specific pen. More information on bedding management can be found in this attached study from 2011: Read Here

  • Getting started 

Week 1 has centered on bringing together the ideas for our project that we've discussed over the Christmas and January period, and setting clear objectives to ensure we have a complete project by week 4.

Our team consists of four members: Vishnu, Aaron, Joiakim, & Brandon. We have decided to split our project up into the following key areas:

Navigation

This aspect of the project requires us to come up with a way to navigate autonomously within an animal enclosure (typically rectangular) and to track our robot's position for use in the Data Representation element mentioned later. This will require an algorithm to maneuver around the enclosure, using ultrasonic sensors for wall detection.

Sensors

This element of the project covers selecting and using sensors to collect temperature and humidity data, as well as the ultrasonic sensors used for wall detection.

Data Representation 

This is a key feature of our project, as ultimately the end users will only really interact with the data. Therefore, we need to present it in a clear, easy-to-understand way. Our intention is to plot a heatmap of temperature and humidity on an x-y axis graph, with a color gradient indicating the measured level at each node (nodes are evenly spaced locations across the enclosure where data has been collected).

What We've Achieved in Week 1 

Motor control 

We've managed to wire up and control four motors via two L298N motor controllers connected to our Arduino Mega 2560 R3 microcontroller. This initial configuration and code use the serial port as input, so the car can be driven from a laptop keyboard. Future iterations for autonomous navigation will require the commands to come from an algorithm and sensor inputs; this stage was mainly to ensure we understand how to use the microcontroller and components before improving on them. Below is a video showing that we are able to drive the motors and control their direction.
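The full code isn't included in this post, but a stripped-down sketch of the keyboard-driven control looks roughly like the following. The pin numbers are placeholders for our wiring, we assume one L298N per side of the car with its ENA/ENB enable pins jumpered for full speed, and the w/a/s/d key mapping is just one possible choice.

// Minimal sketch of the keyboard-over-serial motor control described above.
// Pin numbers are placeholders; each L298N is assumed to drive one side of
// the car (two motors), with ENA/ENB jumpered so both channels run at full speed.

// Driver 1 (left-side motors)
const int L_IN1 = 22, L_IN2 = 23, L_IN3 = 24, L_IN4 = 25;
// Driver 2 (right-side motors)
const int R_IN1 = 26, R_IN2 = 27, R_IN3 = 28, R_IN4 = 29;

const int inputPins[] = {L_IN1, L_IN2, L_IN3, L_IN4, R_IN1, R_IN2, R_IN3, R_IN4};

// Drive one side's two motors forward (dir > 0), backward (dir < 0) or stop (0).
void setSide(int in1, int in2, int in3, int in4, int dir) {
  digitalWrite(in1, dir > 0); digitalWrite(in2, dir < 0);
  digitalWrite(in3, dir > 0); digitalWrite(in4, dir < 0);
}

void drive(int leftDir, int rightDir) {
  setSide(L_IN1, L_IN2, L_IN3, L_IN4, leftDir);
  setSide(R_IN1, R_IN2, R_IN3, R_IN4, rightDir);
}

void setup() {
  Serial.begin(9600);
  for (int pin : inputPins) pinMode(pin, OUTPUT);
  drive(0, 0);  // start stopped
}

void loop() {
  if (Serial.available()) {
    char key = Serial.read();
    switch (key) {
      case 'w': drive( 1,  1); break;  // forward
      case 's': drive(-1, -1); break;  // backward
      case 'a': drive(-1,  1); break;  // spin left
      case 'd': drive( 1, -1); break;  // spin right
      default:  drive( 0,  0); break;  // any other key stops
    }
  }
}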


Temperature and humidity sensor 

We have utilised the Gravity SHT31-F digital temperature and humidity sensor for our project. This is the device that will measure the conditions of the bedding, so we can generate a heatmap to help determine whether it needs replacing. We have managed to wire up the component successfully and have it print the required data to the screen. The next step is to find a way to store this data on an SD card so we can use it within MATLAB. The current code will need to be altered for this purpose and will therefore be included in the blog at a later date.
 
Figure 1: Gravity: SHT31-F digital temperature and humidity sensor
Figure 2: Sensor data output to screen 
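The reading loop behind Figure 2 isn't reproduced here yet, but a minimal sketch of the idea is shown below. It assumes the widely used Adafruit SHT31 Arduino library, since the Gravity board carries a standard Sensirion SHT31 on I2C at the default address 0x44; DFRobot's own library would work just as well with slightly different calls.

// Rough sketch of the temperature/humidity readout described above.
// Assumes the Adafruit SHT31 library; the sensor sits on the I2C bus
// at the default address 0x44.

#include <Wire.h>
#include <Adafruit_SHT31.h>

Adafruit_SHT31 sht31 = Adafruit_SHT31();

void setup() {
  Serial.begin(9600);
  if (!sht31.begin(0x44)) {           // 0x44 is the default I2C address
    Serial.println("Couldn't find SHT31 sensor");
    while (true) delay(10);
  }
}

void loop() {
  float t = sht31.readTemperature();  // degrees C
  float h = sht31.readHumidity();     // % relative humidity

  if (!isnan(t) && !isnan(h)) {
    Serial.print("Temperature: "); Serial.print(t); Serial.print(" C, ");
    Serial.print("Humidity: ");    Serial.print(h); Serial.println(" %RH");
  } else {
    Serial.println("Failed to read from SHT31");
  }
  delay(2000);  // one reading every two seconds
}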



Data Representation of temperature and humidity using MATLAB


To illustrate how the data will be presented in a user-friendly manner, we created a heatmap in MATLAB using simulated data. This heatmap visually represents the pen, showcasing data collected at each point traversed by the robotic car. Farmers, or end users, can easily interpret this data, understanding the temperature and humidity levels at various points on the farm. Such insights are crucial for identifying potential hotspots for bacterial and fungal proliferation. While the current data is simulated to demonstrate the concept of data mapping, our future endeavors will focus on integrating real data into this visualization to ensure it is both meaningful and actionable for the user.

Figure 3: Heatmap of the Temperature & Humidity

 



Using an ultrasonic sensor to detect a wall


We have used an HC-SR04 ultrasonic sensor, which sends out short bursts of ultrasound and listens for the returning echo. The distance to an obstacle is worked out from the speed of sound and the time taken for the echo to come back. The sensor keeps sending out bursts, and when the calculated distance drops below a threshold that we set, the code reacts and lets the car turn. This is essential for an autonomously navigating car, as it must know when it needs to turn rather than driving into a wall.

Figure 4: Code for the Ultrasonic Sensor to detect a wall


Currently, the code repeatedly sends out pulses and calculates the distance; when the calculated distance is less than the threshold defined at the start of the code, it enters a stage in which the car is allowed to turn.
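Since the full listing is only shown in Figure 4, here is a rough sketch of the approach; the trigger/echo pins and the 20 cm threshold are placeholder values rather than our final settings.

// Sketch of the wall-detection loop described above (the full version is in
// Figure 4). Pin numbers and the distance threshold are placeholders.

const int TRIG_PIN = 9;
const int ECHO_PIN = 10;
const float WALL_THRESHOLD_CM = 20.0;  // turn when closer than this

float readDistanceCm() {
  // Send a 10 microsecond trigger pulse.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  // Echo pulse width = round-trip time of the ultrasonic burst.
  unsigned long duration = pulseIn(ECHO_PIN, HIGH, 30000UL);  // 30 ms timeout
  if (duration == 0) return -1.0;      // no echo received

  // Distance = (time * speed of sound) / 2; roughly 0.0343 cm per microsecond.
  return duration * 0.0343 / 2.0;
}

void setup() {
  Serial.begin(9600);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

void loop() {
  float distance = readDistanceCm();
  Serial.print("Distance: "); Serial.print(distance); Serial.println(" cm");

  if (distance > 0 && distance < WALL_THRESHOLD_CM) {
    // Wall detected: this is where the turning routine will be triggered.
    Serial.println("Wall ahead - turn");
  }
  delay(100);  // short pause between bursts
}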


Modelling a sensor holder so it is fixed to the chassis

During the week 1 lab session we recognised as a group that the sensor would not be able to stand on the chassis on its own, so to counter this we decided to use the CAD software Inventor to create a holder that can be mounted onto the chassis. To begin with, it was important to look up the dimensions of the HC-SR04 ultrasonic sensor to get a rough idea of the dimensions needed for the sketches. The CAD software made it easy to alter the dimensions, and the first model of the ultrasonic sensor stand can be seen below.


Figure 6: Model of Ultrasonic sensor stand


Figure 7: Model of Ultrasonic sensor stand

The next step for this would be to create another model of the final product (ready for 3D printing), with the correct dimensions for two holes so it can be fixed onto the corresponding area of the chassis, as well as a bit more of a finishing touch on the edges. 

