Criterion for Selecting a Communication Standard for Environmental Data
The Galapagos Archipelago is a nearly pristine island system that hosts a number of diverse biomes. To keep these biomes in their current state, a sensor network must be designed to be as unobtrusive as physically possible. There are thus two philosophical perspectives from which to view the problem: a multitude of small sensors, or a few large ones. Choosing between these paths establishes the constraints under which the remainder of the design must operate. In our operating environment, a cluster of macro-sized sensors proves to be the more practical choice, for the reasons that follow.
Small sensors whose mass approaches zero certainly conform to the idea of being unobtrusive. However, they pose quite another problem: if the sensors are too small to be visible to the naked eye, how are they to be tracked relative to where they were placed? This is in addition to the obvious engineering problems that arise in manufacturing as sensor size approaches zero: conductivity becomes impossible to control, signal strength drops to near zero, and other electrical engineering problems remain unresolved as of yet. Accounting for these inadequacies means making the sensor large enough to be visible. Yet even for an object the size of a cell phone, achieving a battery life of more than a few days limits the communication distance to a mere hundred meters or so. We would then be left with a large scattering of sensors that appear as pieces of trash in the otherwise pristine environment, where a single, moderately larger sensor could generate similar information while being consolidated to a single location within a 25 square kilometer grid, in place of 2,500 individual sensors. Further, the logistics of deploying and maintaining a network of approximately 800,000 such sensors (if each sensor is spaced 100 meters from its nearest neighbor in a square grid pattern) would be prohibitive. Ultimately we would be faced with the impossible task of managing an unfathomable amount of information, much of it not that useful. It has been said that more information is always better, but that is not the case in this instance: better information is better. Quantity is not a trump card, as it generates enormous numbers of erroneous micro-scale data fluctuations that may or may not be valuable to scientists, and analyzing them is a nearly superhuman task. These irreconcilable problems generated by micro-sized sensors necessitate a macro-sized solution to the problem.
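The sensor-count arithmetic above can be sketched as follows. The 100 meter spacing, the 25 square kilometer zone, and the roughly 800,000-sensor total come from the text; the 8,000 square kilometer figure for the archipelago's total land area is our own assumption, chosen because it reproduces that total.

```python
# Back-of-the-envelope count of micro-sensors needed to blanket an area
# at a given square-grid spacing.

def sensors_for_area(area_km2: float, spacing_m: float) -> int:
    """Number of sensors in a square grid covering area_km2 at spacing_m."""
    per_km2 = (1000.0 / spacing_m) ** 2   # sensors per square kilometre
    return round(area_km2 * per_km2)

print(sensors_for_area(25, 100))      # 2500 sensors per 25 km^2 zone
print(sensors_for_area(8000, 100))    # 800000 for an assumed 8,000 km^2 land area
```

At 100 meter spacing the density is 100 sensors per square kilometer, which is what makes the archipelago-wide total balloon so quickly.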
Due to the distances involved and the network's low power budget but relatively low bandwidth needs, the most obvious solution is radio transmission, backed by an on-site data storage system for the unavoidable event of a communications breakdown. While a transmission consumes on the order of one watt or less, transmissions occur only four times per hour rather than continuously, conserving a significant amount of power that would otherwise be wasted.
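The power saving from this intermittent schedule can be made concrete with a duty-cycle estimate. The one-watt ceiling and four transmissions per hour are from the text; the two-second burst length is our own assumption for illustration.

```python
# Duty-cycle estimate of the radio's average power draw under the
# four-transmissions-per-hour schedule described above.

TX_POWER_W = 1.0       # stated upper bound on power during a transmission
TX_PER_HOUR = 4        # transmissions per hour (from the text)
TX_DURATION_S = 2.0    # assumed length of one transmission burst

duty_cycle = (TX_PER_HOUR * TX_DURATION_S) / 3600.0
avg_power_w = TX_POWER_W * duty_cycle
print(f"duty cycle: {duty_cycle:.4%}, average radio power: {avg_power_w * 1000:.1f} mW")
```

Under these assumptions the radio is active about 0.2% of the time, so its average draw is a few milliwatts rather than a full watt, which is what makes multi-month battery life plausible.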
Spectroscopic Data Transmission Handling
The data gleaned from the spectroscopy monitoring satellites has no need to be transmitted via the channels common to the sensor network. Since we will be using an already well-established satellite network, there is no need to worry about how the data it collects is relayed back to us. All that must be considered is where the data is stored and how it is to be processed: it can simply be routed to the central server and consolidated there.
The data taken from airplane spectroscopy is far more exacting than that taken from the satellites, on the order of nanometer accuracy for airplanes at 40,000 feet versus micro- or millimeter accuracy for satellites in low earth orbit. This information is also far more selective than the wide field of the satellite, so we must first determine the areas of the islands that require the most study and then relay that information to the aircraft pilot at the airport on the isle of Baltra, giving him the coordinates of a route to fly. The information collected on these flights, whose frequency has yet to be determined, must then be incorporated into the central data store for processing.
Tag Transmission Handling
Tagging terrestrial animals has been used for decades as a means of counting populations and charting migratory patterns. Unfortunately, it has typically required a human in the field to observe the animal and record its movement. In this sensor network, however, each tag will use an implementation of an emerging short-distance standard defined by the Institute of Electrical and Electronics Engineers (IEEE) as standard 802.15.4. For more information on how the tags will function, please refer to the Tagging portion of the website; this is only a description of how the communication is slated to work.
Communication in this network needs to flow in only one direction: from the tag to the ZigBee communication beacon at each of the sensor nodes on the islands. The tracking tags differ from the sensor nodes in that they need neither to communicate with each other nor to store any data about their present location. They merely need to transmit their unique identification number and a timestamp so that we can build a time-based map of animal movement patterns throughout the zones, based on triangulation or, if too few sensor nodes are in range, on a generalized area around the receiving sensor at a distance determined from the difference between the timestamp on the packet and the time the sensor node received it, multiplied by the propagation velocity of the wave. The antenna placed inside the tagging wristband will be able to transmit far beyond the 100 meters originally anticipated by the standard, due largely to its high gain, hopefully reaching near the 2.5 kilometer mark by line of sight. This is just enough to provide continuous coverage under our sensor station deployment scheme, as described in the Terrestrial Sensor portion of the website.
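The two localisation modes described above can be sketched as follows: an exact fix by trilateration when three or more nodes hear the tag, and a fall-back range ring (distance = propagation speed times time of flight) when only one node does. The node coordinates and tag position here are hypothetical, and real systems would need far finer clock synchronisation than this sketch implies.

```python
import math

C = 299_792_458.0  # radio propagation speed, m/s

def range_from_tof(t_sent: float, t_received: float) -> float:
    """Fall-back mode: radius of the ring the tag must lie on around one node."""
    return C * (t_received - t_sent)

def trilaterate(p1, r1, p2, r2, p3, r3):
    """Solve for the tag's (x, y) from three node positions and measured ranges."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting the circle equations pairwise gives two linear equations.
    a = 2 * (x2 - x1); b = 2 * (y2 - y1)
    c = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    d = 2 * (x3 - x2); e = 2 * (y3 - y2)
    f = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    x = (c * e - f * b) / (e * a - b * d)
    y = (c * d - a * f) / (b * d - a * e)
    return x, y

# Hypothetical tag at (1000, 500) m heard by three nodes on a 2.5 km grid:
x, y = trilaterate((0, 0), math.hypot(1000, 500),
                   (2500, 0), math.hypot(1500, 500),
                   (0, 2500), math.hypot(1000, 2000))
print(round(x), round(y))  # 1000 500
```

With only one receiving node, `range_from_tof` constrains the tag to a circle rather than a point, which matches the "generalized area" fall-back in the text.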
The data collected by each sensor node is then incorporated into the datalogger in place at that node and is transmitted, along with the other information gathered by the sensor, by the same radio-frequency process described in the Sensor Transmission Handling section.
Sensor Transmission Handling
As previously stated, the range of transmission between the sensors is at most 8.3 kilometers by direct line of sight. Accounting for obstructions, we have placed sensors and/or repeaters at 5 kilometer intervals. The transmitted data must be collected by some means or all the collection is for naught. Since adjacent sensors are well within transmission range, they serve as the perfect vehicle for this task in addition to monitoring the environment. So, rather than merely transmitting data, the sensors will be fitted with an antenna capable of receiving as well as transmitting. Collected data can then be relayed with the data of the sensor that received it, along with the identification number of the sensor that recorded it and a timestamp of when the record was taken to identify the location and time of the data, at the usual thirty-minute interval. Because of the close spacing of the sensors, an algorithm will compare each received record's timestamp and identification number with those of all records collected recently. If a node has already processed the record, it is regarded as a duplicate and ignored; otherwise the node accepts it, stores its timestamp and identification number for near-future use, and waits to transmit it to its "parent" node at the next communication cycle.
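The duplicate-suppression step above can be sketched as follows: each relaying node remembers the (sensor id, timestamp) pairs it has already handled and drops any packet it has seen before. The class and field names, and the size of the memory of recent records, are our own assumptions.

```python
from collections import OrderedDict

class DedupRelay:
    """Suppress duplicate packets using (sensor id, timestamp) as the key."""

    def __init__(self, capacity: int = 10_000):
        self.seen = OrderedDict()   # (sensor_id, timestamp) -> None, oldest first
        self.capacity = capacity
        self.outbox = []            # packets awaiting the next communication cycle

    def receive(self, sensor_id: str, timestamp: int, payload: bytes) -> bool:
        key = (sensor_id, timestamp)
        if key in self.seen:                # already processed: ignore duplicate
            return False
        self.seen[key] = None
        if len(self.seen) > self.capacity:  # forget the oldest entries
            self.seen.popitem(last=False)
        self.outbox.append((sensor_id, timestamp, payload))
        return True

relay = DedupRelay()
print(relay.receive("S-0412", 1700000000, b"t=24.1C"))  # True  (new record)
print(relay.receive("S-0412", 1700000000, b"t=24.1C"))  # False (duplicate, ignored)
```

Bounding the memory of recent records keeps the check cheap on a low-power node while still catching the duplicates that the close spacing makes inevitable.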
The "mother sensor node" of each island is located at that island's most central sensing location. This minimizes the transmission energy its child sensors need to reach it. The mother node has one additional piece of hardware that makes it unique among the sensors: instead of a radio transmitter, it houses a transceiver capable of communicating with the ARGOS satellite system. This system of four primary satellites and three reserve satellites in sun-synchronous orbit ensures that a satellite will always be available for data transmission. In the event that cloud cover or some other obstruction temporarily blocks satellite transmission, the mother node uses its on-site data storage device to hold all the data waiting to be transmitted; once transmission becomes possible again, it begins broadcasting to ARGOS. This data is then relayed via satellite to an offsite data server in a major city, preferably not in a developing region. That server is delegated the task of information management and referencing.
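The mother node's store-and-forward behaviour can be sketched as follows: records accumulate in on-site storage while the ARGOS uplink is blocked and are flushed in arrival order once it returns. The `send` and `uplink_available` callables stand in for real satellite-modem operations and are our own assumptions.

```python
from collections import deque

class MotherNode:
    """Buffer records on site until the satellite uplink is available."""

    def __init__(self, send, uplink_available):
        self.buffer = deque()                  # on-site storage of pending records
        self.send = send                       # transmit one record to the satellite
        self.uplink_available = uplink_available

    def ingest(self, record):
        self.buffer.append(record)
        self.flush()

    def flush(self):
        # Broadcast the backlog in order, stopping if the uplink drops again.
        while self.buffer and self.uplink_available():
            self.send(self.buffer.popleft())

sent = []
up = [False]   # cloud cover: uplink initially blocked
node = MotherNode(send=sent.append, uplink_available=lambda: up[0])
node.ingest({"id": "S-0412", "temp_c": 24.1})
print(len(sent), len(node.buffer))  # 0 1 -> record held in on-site storage
up[0] = True                        # obstruction clears
node.flush()
print(len(sent), len(node.buffer))  # 1 0 -> backlog broadcast to the satellite
```

The ordered queue preserves timestamps' arrival order, so the offsite server receives the backlog exactly as it would have without the outage.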
Data communication is inherently risky; lost bytes and dropped signals are relatively common occurrences. Within our communication system, however, redundancy and security are built into the periodicity of the sensor network design. No sensor or repeater is located excessively far from its nearest neighbor, so at least two sensors will always hold the same information until confirmation that it has been received, uncorrupted, by the mother node and transmitted to ARGOS. Given the intercommunication between the sensors in the network, we believe the system is reasonably secure for its time frame of deployment and the remoteness of the Galapagos. Like any system it has weaknesses, but given the number of communication options between the sensors, none of them is overbearing.
Algorithm Design and Data Handling
Given our number of sensors, satellites, and aerial data sources, and the length of time they will be gathering data, the sheer volume of information coming through the server will be staggering. While it is not unfathomable that a single computer could handle all of the data processing, a more elegant and public-relations-friendly solution is a distributed computing system similar to those the Search for Extra-Terrestrial Intelligence and the Great Internet Mersenne Prime Search employ: a distributable software client that runs on an individual's computer and reports its results back to the server for further analysis by scientists. This approach provides a two-pronged benefit: it generates public support and enthusiasm for the research being conducted, and it continually reminds the general public of the situation in the Galapagos and the need for ongoing monetary and moral support. Further, the program can run as a low-priority background process on a user's computer, so data is continually being processed and evaluated. While some publicity will be needed to establish it, the overwhelming response to those other programs suggests that building a stable client base among the environmentally aware activists of the world will not be a problem.
The algorithm we will use to evaluate the data sent to each individual terminal is defined on short, medium, and long term scales. At this point we have no specific process for determining what is going wrong, or right, in the ecosystem without an initial data set; the sensor network will be its own best interpreter. The first sets of data collected, approximately 100, will be compared with previously gathered data and serve as the baseline for how the ecosystem functions. As further data comes in, we will be able to evaluate on a relative, rather than absolute, scale how the ecosystem is doing. This process could easily be converted to an automated program that evaluates what is occurring in the ecosystem and alerts scientists to changes in its functioning. The data should return as a readily quantified, easily comprehensible system of patterns that can then be evaluated by scientists or sent out again for further evaluation by different algorithms. The general idea is to minimize the human effort applied to analysis so that it may be focused entirely on problems we see arising on the short, medium, and long term scales.
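The relative-evaluation idea above can be sketched as follows: the first roughly 100 data sets define a baseline, after which each new reading is scored against that baseline (here as a z-score) and flagged for scientists when it deviates sharply. The window size, threshold, and scoring method are our own assumptions; the real algorithms would be chosen once the initial data set exists.

```python
import statistics

class BaselineEvaluator:
    """Build a baseline from the first readings, then score new ones against it."""

    def __init__(self, baseline_size: int = 100, threshold: float = 3.0):
        self.baseline = []
        self.baseline_size = baseline_size
        self.threshold = threshold          # z-score magnitude that triggers an alert

    def evaluate(self, value: float):
        """Return None while the baseline is being built, else (z_score, alert)."""
        if len(self.baseline) < self.baseline_size:
            self.baseline.append(value)
            return None
        mean = statistics.fmean(self.baseline)
        stdev = statistics.stdev(self.baseline)
        z = (value - mean) / stdev if stdev else 0.0
        return z, abs(z) > self.threshold

# Tiny demonstration with a 5-reading baseline of temperatures:
ev = BaselineEvaluator(baseline_size=5)
for t in (24.0, 24.2, 23.9, 24.1, 24.0):
    ev.evaluate(t)                 # baseline still forming, returns None
z, alert = ev.evaluate(31.5)       # a sharp deviation from the baseline
print(alert)                       # True -> flagged for a scientist to review
```

Scoring relative to the network's own history, rather than against fixed limits, matches the text's point that the network must be its own best interpreter.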