University of Washington Tacoma CES Projects

Team "Triton" Remote Aquarium Viewing System

Our team had the opportunity to display Phase I of our prototype at a Seattle conference for the marine aquarium hobby in February of 2014. This version of the prototype allowed users to control the system locally with a custom hand-controller while viewing the video feed on a monitor next to the display. The design allows movement along the X, Y, and Z axes, along with yaw of the camera housing and pitch of the camera itself.

See extra photos/videos

Problem Statement

Hobbyists and public aquarium administrators are seeking a commercially available, interactive video experience with the coral reef ecosystems under their care, to facilitate close monitoring of the various competing life-forms in those environments for educational and health-related purposes. A viewing system is therefore needed that provides more viewing angles than a single stationary camera can. The camera's video feed should be viewable locally on a monitor, and the camera should be controllable by some means other than physically repositioning and reorienting it by hand over the aquarium. Ideally, the video feed and position control should also be accessible over the internet. Stability of the viewing apparatus is critical because the welfare of delicate aquatic life such as coral and fish is the highest priority; the system should therefore include some form of accident avoidance to prevent collisions with rocks, coral, and any projecting components of the environment's life-support system.

Design

After considering all of the constraints, objectives, and functions the design needed to meet, a gantry-style design was settled upon. Stepper motors drive the camera housing along the X and Y axes, while the Z axis and camera yaw are driven with DC motors. All of the motors are controlled through separate Pololu driver boards. The pitch of the camera is driven by a servo with two positions: horizontal and vertical (viewing straight down into the aquarium). Limit switches on every axis prevent accidental damage to the aquarium or to the system itself. All movement and sensor input are processed by an Arduino Mega, chosen for the amount of I/O it supports and for the time constraints placed on the project's development. A Raspberry Pi controls the camera because it supports high-definition video and Wi-Fi and, like the Arduino Mega, allows for rapid prototyping thanks to the vast amount of open-source software available to the public.
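As an illustration of the stepper control described above, a requested travel distance must be converted into the direction and step count that a STEP/DIR driver such as the Pololu boards expects. The calibration constant and function name below are assumptions for illustration, not the project's actual firmware values.

```python
# Hypothetical sketch: convert a requested move in millimeters into the
# direction and step count a STEP/DIR stepper driver expects.
# STEPS_PER_MM is an assumed calibration value, not the project's real one.

STEPS_PER_MM = 80  # illustrative: depends on motor, microstepping, and drive

def plan_move(delta_mm):
    """Return (direction, step_count) for a requested travel in mm."""
    direction = 1 if delta_mm >= 0 else -1
    steps = round(abs(delta_mm) * STEPS_PER_MM)
    return direction, steps
```

In a real firmware loop, the direction would set the driver's DIR pin and the step count would determine how many pulses to emit on the STEP pin.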

There were two major phases of development. The first phase focused on controlling the system locally with dual joysticks and a video feed to a local monitor. The second phase focused on migrating all control and video feed functionality to a web-based user interface. The Raspberry Pi was critical in this second phase, enabling control of the system over the internet. The camera is housed in acrylic so that it can be submerged under water; acrylic allows maximum visibility while preventing corrosion of the electronic components. An initial "glass box" diagram of the design of the system can be seen below.


Phase I - Local Control

A physical design was mocked up in SketchUp to gain insight into the placement of all of the mechanical hardware needed to facilitate movement. Materials such as aluminium, stainless steel, and acrylic were used in the fabrication to mitigate the damage the system will sustain long-term from its intended use over salt water. Many of the acrylic components, such as the custom acrylic-and-poly bearing system that enables rotation of the camera housing, were fabricated on a laser cutter.

After some initial fabrication, the various axes were ready for testing.


The video functionality was then integrated.


Phase II - Web Control

There were three major milestones to reach in this phase.

1. Refining control with multiple microcontrollers

Due to timing limitations of the existing microcontroller, the system was divided into five major sections:

  •     Microcontroller for X axis, Y axis, and nocturnal light control
  •     Microcontroller for Z axis, rotation, and tilt control
  •     Microcontroller for obstacle avoidance
  •     Microprocessor for video transmission
  •     Microprocessor for web hosting and communicating I/O to MCUs
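The division of labor above means the web-hosting microprocessor must route each user command to the board responsible for it. A minimal sketch of that dispatch is shown below; the command names and MCU labels are assumptions for illustration, not the project's actual protocol.

```python
# Illustrative sketch: route user commands from the web-hosting
# microprocessor to the motion MCUs. Command names and MCU labels
# are placeholders, not the project's real identifiers.

ROUTES = {
    "move_x": "xy_mcu",   # X axis, Y axis, and nocturnal light MCU
    "move_y": "xy_mcu",
    "light":  "xy_mcu",
    "move_z": "zrt_mcu",  # Z axis, rotation, and tilt MCU
    "rotate": "zrt_mcu",
    "tilt":   "zrt_mcu",
}

def route_command(cmd):
    """Return which MCU a given user command should be forwarded to."""
    return ROUTES.get(cmd, "unknown")
```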

The system after being redesigned:

2. Implementing a web interface

After the initial web interface was implemented, testing began. We found that with fixed directions of travel along the X and Y axes, the user quickly became disoriented whenever the rotation of the camera housing was used. With the camera housing rotated 90 degrees from its initial orientation, an attempted movement to the left as perceived by the user actually translated to a movement back along the Y axis.

For this reason, the design had to be adjusted to make the user interface more intuitive. A sensor array was added to the rotational axis so that the system could monitor the orientation of the camera housing while in use. This allowed us to give the user a first-person viewing perspective in web mode: regardless of which direction the camera is facing in the aquarium, the buttons to move left, right, forward, and backward translate to those movements by mapping the true X and Y axis movements onto 8 discrete possible orientations of the camera housing.
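The first-person mapping just described can be sketched as a rotation of the user's button direction by the housing's measured orientation, snapped to unit motor steps. The axis conventions and direction angles below are assumptions for illustration.

```python
import math

# Sketch of the first-person control mapping: a user button press is
# rotated by the camera housing's orientation (one of 8 discrete
# 45-degree states) into true gantry X/Y motion. The axis conventions
# and angle assignments are assumed for illustration.

USER_DIRS = {"right": 0, "forward": 90, "left": 180, "back": 270}

def to_gantry_motion(button, orientation_step):
    """Map a button press to (dx, dy) in the gantry frame.

    orientation_step: 0-7, each step = 45 degrees of housing rotation.
    """
    angle = math.radians(USER_DIRS[button] + 45 * orientation_step)
    # Snap to unit steps so diagonal orientations drive both motors.
    return round(math.cos(angle)), round(math.sin(angle))
```

With the housing rotated 90 degrees (two 45-degree steps), a "left" press correctly becomes a move back along the Y axis in the gantry frame, matching the behavior described above.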

3. Designing and implementing obstacle avoidance capabilities

A "halo" of 8 IR LED/detector pairs was built into a sealed acrylic housing surrounding the camera housing. The detector circuit is run through an op-amp to filter and amplify the received signal. In testing, we found that the sensor circuit is minimally affected by the most current aquarium lighting technology; six Rapid LED PAR38 (Mixed Color with UV) bulbs were used in testing. More testing should be conducted to ensure that other common lighting configurations, such as MH, T-5 HO, and T-8, also introduce minimal noise into the system. The sensor array communicates interrupt signals to the rest of the system to indicate an obstacle in any of the 8 major directions of travel, preventing one or both motors from traversing further in that direction.
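The veto logic implied by those interrupt signals can be sketched as follows: each of the 8 sensor pairs guards one direction of travel, and a triggered sensor blocks further motion that way. The direction numbering used here (0 = +X, counting counter-clockwise in 45-degree steps) is an assumption for illustration.

```python
# Sketch of the travel-blocking logic: a requested (dx, dy) move is
# refused if its direction index is in the set of blocked directions
# reported by the IR halo. Direction numbering is assumed.

DIRS = [(1, 0), (1, 1), (0, 1), (-1, 1),
        (-1, 0), (-1, -1), (0, -1), (1, -1)]

def motion_allowed(dx, dy, blocked):
    """Allow a (dx, dy) move unless its direction index is blocked."""
    if (dx, dy) == (0, 0):
        return True
    # Reduce the move to a unit step to find its direction index.
    step = (max(min(dx, 1), -1), max(min(dy, 1), -1))
    return DIRS.index(step) not in blocked
```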


After months of testing and reconfiguring the sensors, a final arrangement was settled on. Once the sensor package was sealed in its final acrylic housing, final tests were run to ensure the system still operated as designed. Initially, a weighted moving average was implemented in software to smooth the data recorded from the sensors. Later, a simpler average for each sensor pair was implemented in a loop: twenty samples are recorded for each sensor pair, averaged, and compared against a hard-coded threshold used to fine-tune the response.

Data collected in testing the obstacle avoidance:
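The per-pair averaging just described can be sketched in a few lines; the threshold value and the sensor-read interface are placeholders, not the project's tuned values.

```python
# Sketch of the per-sensor averaging described above: twenty readings
# from one IR pair are averaged and compared against a hard-coded
# threshold. SAMPLES matches the text; THRESHOLD is illustrative only.

SAMPLES = 20
THRESHOLD = 512  # placeholder ADC level, tuned per installation

def obstacle_detected(read_sensor):
    """Average SAMPLES readings from one pair; True if above threshold."""
    total = sum(read_sensor() for _ in range(SAMPLES))
    return total / SAMPLES > THRESHOLD
```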

The final step in completing the design was to add an initialization phase. When the system is first powered on or reset, the camera housing rotates back to its furthest limit while the housing is lifted out of the water. Rotating back to a limit ensures that the rotational state initializes to the same orientation every time, and lifting the camera out of the water ensures that all obstacles are avoided as the system resets to the origin. The next step moves the camera housing back to the origin so that it is out of the way for tank maintenance. The final step rotates the camera back to the optimal viewing position, looking across the aquarium, while the camera housing is lowered back into the water.
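The initialization sequence above can be sketched as a simple ordered routine; the actuator object and its method names are hypothetical placeholders, not the project's real interface, and the steps shown sequentially here overlap in the actual system.

```python
# Sketch of the power-on initialization sequence; `rig` is a
# hypothetical actuator interface with illustrative method names.

def initialize(rig):
    rig.raise_housing()    # lift the camera clear of the water
    rig.rotate_to_limit()  # home rotation against its limit switch
    rig.move_to_origin()   # park out of the way for tank maintenance
    rig.rotate_to_view()   # face the optimal view across the aquarium
    rig.lower_housing()    # submerge again, ready for use
```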

Final Presentation of the System


Team Members for the Remote Camera System

Faculty Sponsor:

Robert Gutmann, Ph.D.