Via Ozy, an interesting look at the surprising link between mountain gorillas and iPhones:
Deep inside the misty, muddy jungle of Volcanoes National Park, elusive forest elephants, buffalo and hyenas roam around, foraging for food in the cool mountain air. A group of Rwandans in army-green fatigues crouch down, too, nestled among the bushes, observing a troop of mountain gorillas. They’re not guarding the animals, or chasing poachers. Nor are they taking a work break to revel in the natural glory all around them.
No, they’re counting.
It’s part of a massive undertaking: Seventy rangers, zoologists and local guides are trying to count every living mountain gorilla. You’d think it would have been done by now — aren’t we always hearing alarmingly low figures on various threatened animal populations? But in the past, determining hard-and-fast numbers on the mountain-gorilla population was nearly impossible.
Welcome to a new, futuristic era of animal conservation, where wildlife tech — in this case, high-end iPhone-style trackers — is being used to track and help save our favorite creatures, whether on land, in the air or in the sea. Technological innovation is replacing the old image of a lone field biologist carrying binoculars and a notebook with things like rangers in Kenya using drones to fight poaching in 52 game parks. You can thank the little computers in every scientist’s pocket, which make it easier to do things like send text messages from the collars of Kenyan elephants to rangers when problem pachyderms come near fields, so the animals won’t get shot and the villagers can still eat. It’s making crime-fighting citizen science a reality, with apps like WildScan in Thailand and Vietnam that let people identify and report the sale of more than 300 illegal pets, from a Southeast Asian box turtle to a pygmy slow loris (we’re looking at you, Rihanna).
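The collar-alert idea boils down to geofencing: each GPS fix from a collar is checked against a boundary drawn around farmland, and rangers get a message when an animal crosses it. Here is a rough, hypothetical sketch of that logic — the coordinates, alert radius and notification hook are invented for illustration and don't come from any of the systems named above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two GPS fixes."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical geofence: the centre of a village's fields and an alert radius.
FIELD_CENTRE = (-1.4833, 35.1500)   # made-up coordinates
ALERT_RADIUS_KM = 1.0

def check_collar_fix(elephant_id, lat, lon, notify):
    """Send an alert if a collared elephant comes within the alert radius."""
    distance = haversine_km(lat, lon, *FIELD_CENTRE)
    if distance <= ALERT_RADIUS_KM:
        notify(f"Elephant {elephant_id} is {distance:.1f} km from the fields")

# Example: use print in place of an SMS gateway.
check_collar_fix("E-042", -1.4901, 35.1552, notify=print)
```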
Sophisticated criminal networks make big bucks from wildlife trafficking and killing animals like elephants, rhinos and tigers — an industry worth up to $10 billion a year, according to the World Wildlife Fund. That ranks it as the fourth-biggest illicit industry behind drug, human and fake goods trafficking. But wildlife tech is fighting back. Ralph Clark, CEO of ShotSpotter, was in South Africa’s Kruger National Park when the system, which pinpoints gunfire coordinates and alerts rangers within 30 seconds, detected gunshots. Poachers were on the hunt. The rangers invited him along to the scene of the crime deep in the bush, where they found a rhino with her “face chopped out” and a baby rhino, terrified, protecting her mother. Though the poachers got to the mom, the park rangers were able to save the baby, which would have otherwise died without her mother. Sadly, this rhino was one of 1,215 killed in South Africa in 2014.
Rhett Butler, conservationist and founder of Mongabay, says it’s remarkable how reluctant most wildlife biologists were to veer from more traditional uses of tech in the field, like camera traps or tagging. There’s been a disconnect between tech engineers and the jungles of Southeast Asia, or the Kalahari Desert in southern Africa, but now cheaper and more prevalent gadgets — not to mention easier-to-grab data from satellites and the cloud — make it more accessible and compelling. This trend is linking together biologists, NGOs, government agencies and techies who want to stop crime, for the sake of the animals and because it’s been said to finance things like terrorism and sex trafficking. ShotSpotter’s original purpose, for example, was to detect gunfire and alert police in 90 cities across the U.S. Even the U.S. government is trying to translate tech to the conservation context, with a $500,000 contest for innovative ideas to bust the bad guys.
Part of fighting poaching is tracking populations. When rangers know precisely how many gorillas or rhinos live in an area, they are aware when one goes missing. And with reliable figures comes better data for scientists and more accurate conservation policies, which can mean bigger animal populations and more tourism, which can equal more money. It’s like an exponentially positive chain reaction. In places like Rwanda, more gorillas have meant far more money. The gorillas drew more than 20,000 visitors in 2014, a threefold increase in 11 years, according to government figures.
It’s a fascinating story, not just about tech, or nature, or the economy, but also about the interconnectedness of it all. Most governments, particularly cash-strapped ones, are not going to conserve wildlife for ethical reasons. But if they see a way for the wildlife to bring in revenue — more gorillas, more money — it’s a different ball game. And every week it seems that iPhone-size technology has new applications.
But some tech toys can do more harm than good in the wrong hands. South African rangers are reluctant to reveal how or where they arrest would-be poachers when the ShotSpotter system is in play because it might reveal the location of the system, which would defeat the whole purpose. The ShotSpotter system underwent a camouflage makeover so poachers can’t find and disassemble it. There’s also the very real issue of corruption — its second shot ever captured, for example, uncovered an inside poaching job. There’s no point in having a detection system if park employees are passing along information. The fear of this has bigger consequences, too. Big-budget buyers like the U.S. Department of Defense are happy to keep tech prices artificially high to keep the gadgets in the “right” hands.
Then there’s the unsexy side of tech in the field. A lot of what’s useful in Silicon Valley simply doesn’t translate to rural areas or match on-the-ground realities — like harsh environments or local perceptions. There are “concerns about rich Westerners using drones to essentially spy on poor communities who may be poaching,” says Butler. Or using technology that doesn’t seek biologist input first. White rhinos, for one, can wear certain collars just fine, but put the same thing on a black rhino and it will rub its ears off. “The reality is … a lot of money can be wasted,” says Eric Dinerstein, director of WildTech at Resolve.
The last big hurdle is something we take for granted: cell connectivity. Conservationists hope the Facebooks and Googles of the world will have success with their Internet airplanes and other futuristic projects aimed at connecting the developing world (where most threatened species live). Because all the tech on the planet won’t make a difference if you can’t actually call for backup when you know where a poacher is.
Via GreenBiz, an article on five lessons from the field of technology and conservation:
What do you get when data scientists, imagery experts and sensor advocates mingle with conservation specialists? A gathering that looks a lot like this year’s World Wildlife Fund Fuller Symposium, appropriately themed “Wired in the Wild.”
The “big” question addressed by this year’s speakers: “Can technology save the planet?” The qualified answer: it will play an essential and increasingly inevitable role, with the right human intervention. “Environmental science depends on software and data,” noted Victoria Espinel, president and CEO of tech trade group BSA, the Software Alliance, in her keynote remarks.
That innovation comes in many shapes — from satellite imagery that helps detect illegal logging to the Internet of “living things” that is being used to catalog bird, mammal and fish migratory patterns. “Imagery is great as a tool to show people things they would not otherwise be able to see,” said John Amos, president of SkyTruth, an organization that has used satellites to track everything from oil spills to mountaintop strip mining.
As a moderator for this year’s symposium, I was granted a front-row view into some of these compelling experiments. Here are some of my takeaways from this extraordinary conference, which took place in November at the National Geographic headquarters in Washington, D.C.
1. Collaboration is crucial
Certainly, the goal of an individual wildlife or habitat research project is very local in nature. Literally. Still, scientists could benefit from sharing best practices about how to use specific technologies — a theme that emerged time and again during the Fuller Symposium presentations.
Sharing is the mission behind a new online community, Wildlabs.net, launched in mid-November by an alliance of influential conservation organizations with the support of Google and chip designer ARM — and trumpeted loudly during the conference.
“By bringing together experts from multiple sectors, we can create solutions to challenges that have proven impossible to solve in isolation,” said Ian Ferguson, vice president of worldwide marketing and strategic alliances for ARM. “Whether that is by enabling communities to apply new tools in protecting their natural resources, transforming our understanding of wildlife and natural systems through tracking and listening technologies, or supporting the detection and prevention of poaching in protected areas, technology can be a powerful tool for the conservation cause.”
The organizations participating in Wildlabs are Conservation International, Fauna & Flora International, International Union for Conservation of Nature, Nature Conservancy, Wildlife Conservation Society, WWF, Zoological Society of London and the Royal Foundation.
2. Community participation matters
Many of the most successful projects discussed during this year’s Fuller Symposium shared this trait: the organizing researchers worked closely with members of the local community to encourage participation.
One example comes from Uganda, where wildlife veterinarian and game warden Margaret Driciru counts on local rangers to submit data about injured or dead animals they find throughout the country’s parks and preserves. The intention is to minimize human-wildlife conflicts. By asking rangers to watch for signs, Driciru was able to educate a broader part of the community about the issue.
Google Labs product lead Katharine Chou offers another illustration, from her work with the Lewa Wildlife Conservancy in northern Kenya. The organization uses tracking technology and Google’s StreetView mapping software to follow the migratory routes of several endangered species, including elephants. The data uncovered by this initiative inspired the creation of a “throughway” that opens up the animals’ traditional migratory route, while allowing humans to traverse the habitat. “We’re helping reroute animals before there is a conflict,” Chou said.
3. Keep an open mind
Any researcher using technology — anything from sensors and wireless communications to specimen-collection equipment — should expect many humbling false starts.
Consider the story of National Geographic explorer David Gruber, a pioneer in deep ocean exploration. Gruber recalls his early expeditions in an “exosuit,” a $600,000 metal suit that looks much like what astronauts use for space exploration.
While Gruber expected the 530-pound suit to make sample collection at depths of up to 1,000 feet simpler, his preference now is for a small bubble-like submarine, which makes it easier for divers to see much more of the ocean’s floor. “My whole view on what the deep sea is like changed in a minute,” he said.
Gruber’s early missions also revealed a shortcoming in the design of the collection arms: The traditional claws were tough on delicate specimens. Now the team is experimenting with a design that mimics an octopus tentacle, strong yet gentle. Far more appropriate for gathering objects from the ocean floor.
4. Consider the ethics of collecting data carefully
Time and again, the Fuller Symposium speakers pointed up the need for more education about how much conservation data should be shared publicly — and what should be secured to protect vulnerable animals. Key goals should be accurate and accessible information, collected in the most non-intrusive way possible.
“We need to responsibly unleash data in an appropriate way,” said D.J. Patil, America’s chief data scientist, who serves in the White House Office of Science and Technology Policy.
Contrary to what many people believe, Microsoft Research scientist Lucas Joppa suggests the conservation community suffers from a lack of information — many databases aren’t being kept up to date with real-time information, he noted.
On the other hand, social networks such as Facebook reach almost 20 percent of the world’s connected population. More attention must be paid to bridging this divide. “How do we use the information age to counter the impact of the Anthropocene?” Joppa asked the Fuller Symposium attendees rhetorically.
5. Don’t underestimate the role of the private sector
A who’s-who list of software companies — ranging from Autodesk to Google to IBM to Microsoft — is contributing intellectual property to conservation projects, several of which were mentioned during symposium presentations. Autodesk’s software, for example, is being used to capture and create highly detailed 3-D images of how coral reefs off the Hawaiian island of Molokai are changing over time.
Google’s outreach is well-documented: close to 200 WWF projects are visualized through Google Earth’s mapping service. This gives scientists, students and citizens unprecedented access to information about habitats and about endangered and invasive species.
Environmental scientist Jack Dangermond is driving a similar crusade in “spatial literacy” with his company Environmental Systems Research Institute (Esri). Schools are a major focus.
“If you look at the future of the world, it doesn’t look so good,” Dangermond told GreenBiz in November 2014. “Kids are not understanding all the patterns and relationships of climate change or urbanization or population growth. This is a way to get them to understand those parameters, in a do-it-yourself, learn by doing, project-based learning environment. Teachers love this. The kids go crazy, when you see them, if they have a good teacher.”
For his part, Patil underscores the role of private sector involvement in advancing conservation that the public sector can’t afford to fund on its own. “People are combining data in very innovative ways,” he said during the symposium. “It’s not a debate, it’s tangible.”
Via
In early October, after the main rainy season, Ethiopia’s central Rift Valley is a study in green. Fields of wheat and barley lie like shimmering quilts over the highland ridges. Across the valley floor below, beneath low-flying clouds, farmers wade through fields of African cereal, plucking weeds and primping the land for harvest.
In the paved and wired developed world, it’s hard to imagine a food emergency staying secret for long. But in countries with bad roads, spotty phone service and shaky political regimes, isolated food shortfalls can metastasize into full-blown humanitarian crises before the world notices. That was in many ways the case in Ethiopia in 1984, when the failure of rains in the northern highlands was aggravated by a guerrilla war along what is now the Eritrean border.
Senay, who grew up in Ethiopian farm country, the youngest of 11 children, was then an undergraduate at the country’s leading agricultural college. But the famine had felt remote even to him. The victims were hundreds of miles to the north, and there was little talk of it on campus. Students could eat injera—the sour pancake that is a staple of Ethiopian meals—just once a week, but Senay recalls no other hardships. His parents were similarly spared; the drought had somehow skipped over their rainy plateau.
That you could live in one part of a country and be oblivious to mass starvation in another: Senay would think about that a lot later.
Via Foreign Affairs, a detailed look at how technology is transforming agriculture:
Thousands of years ago, agriculture began as a highly site-specific activity. The first farmers were gardeners who nurtured individual plants, and they sought out the microclimates and patches of soil that favored those plants. But as farmers acquired scientific knowledge and mechanical expertise, they enlarged their plots, using standardized approaches—plowing the soil, spreading animal manure as fertilizer, rotating the crops from year to year—to boost crop yields. Over the years, they developed better methods of preparing the soil and protecting plants from insects and, eventually, machines to reduce the labor required. Starting in the nineteenth century, scientists invented chemical pesticides and used newly discovered genetic principles to select for more productive plants. Even though these methods maximized overall productivity, they led some areas within fields to underperform. Nonetheless, yields rose to once-unimaginable levels: for some crops, they increased tenfold from the nineteenth century to the present.
Today, however, the trend toward ever more uniform practices is starting to reverse, thanks to what is known as “precision agriculture.” Taking advantage of information technology, farmers can now collect precise data about their fields and use that knowledge to customize how they cultivate each square foot.
One effect is on yields: precision agriculture allows farmers to extract as much value as possible from every seed. That should help feed a global population that the UN projects will reach 9.6 billion by 2050. Precision agriculture also holds the promise of minimizing the environmental impact of farming, since it reduces waste and uses less energy. And its effects extend well beyond the production of annual crops such as wheat and corn, with the potential to revolutionize the way humans monitor and manage vineyards, orchards, livestock, and forests. Someday, it could even allow farmers to depend on robots to evaluate, fertilize, and water each individual plant—thus eliminating the drudgery that has characterized agriculture since its invention.
ACRE BY ACRE
The U.S. government laid the original foundations for precision agriculture in 1983, when it announced the opening up of the Global Positioning System (GPS), a satellite-based navigation program developed by the U.S. military, for civilian use. Soon after, companies began developing what is known as “variable rate technology,” which allows farmers to apply fertilizers at different rates throughout a field. After measuring and mapping such characteristics as acidity level and phosphorous and potassium content, farmers match the quantity of fertilizer to the need. For the most part, even today, fields are tested manually, with individual farmers or employees collecting samples at predetermined points, packing the samples into bags, and sending them to a lab for analysis. Then, an agronomist creates a corresponding map of recommended fertilizers for each area designed to optimize production. After that, a GPS-linked fertilizer spreader applies the selected amount of nutrients in each location.
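In software terms, the workflow above is a prescription map plus a position lookup: georeferenced soil samples become a map of recommended rates, and the GPS-linked spreader queries that map for wherever it currently is. A minimal sketch, with invented sample data and a nearest-sample lookup standing in for the agronomist's zone map:

```python
# Hypothetical soil samples: (latitude, longitude, recommended fertilizer rate in kg/ha).
# In practice an agronomist interpolates these into management zones; here the
# nearest sample simply serves as the prescription for the spreader's position.
soil_samples = [
    (41.5801, -93.6010, 90.0),
    (41.5805, -93.6002, 60.0),
    (41.5810, -93.6018, 120.0),
]

def prescribed_rate(lat, lon):
    """Return the fertilizer rate (kg/ha) of the nearest soil sample."""
    def sq_dist(sample):
        s_lat, s_lon, _ = sample
        return (lat - s_lat) ** 2 + (lon - s_lon) ** 2
    return min(soil_samples, key=sq_dist)[2]

# As the GPS-linked spreader moves, it looks up the rate for each position.
print(prescribed_rate(41.5803, -93.6005))  # -> 60.0
```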
Over 60 percent of U.S. agricultural-input dealers offer some kind of variable-rate-technology services, but data from the U.S. Department of Agriculture indicate that in spite of years of subsidies and educational efforts, less than 20 percent of corn acreage is managed using the technology. At the moment, a key constraint is economic. Because manual soil testing is expensive, the farmers and agribusinesses that do use variable rate technology tend to employ sparse sampling strategies. Most farmers in the United States, for example, collect one sample for every two and a half acres; in Brazil, the figure is often just one sample for every 12 and a half acres. The problem, however, is that soil can often vary greatly within a single acre, and agricultural scientists agree that several tests per acre are often required to capture the differences. In other words, because of the high cost of gathering soil information, farmers are leaving productivity gains on the table in some areas of the field and overapplying fertilizer and other inputs in others.
Researchers are beginning to tackle the problem, developing cheap sensors that could allow farmers to increase their sampling density. For example, one new acidity sensor plunges an electrode into the soil every few feet to take a reading and records the GPS coordinates; manually sampling on that scale would be far too costly. Such sensors have not yet arrived at most farms, however. Some haven’t proved reliable enough, breaking after a few acres of use, whereas others aren’t accurate enough. But several research groups around the world are working on developing sturdier ones.
More practical are sensors that look at the color of plants to determine their nutritional needs. Plants with too little nitrogen, for example, tend to turn pale green or yellow, whereas those with enough appear dark green. Several U.S. and European companies have developed sensors that detect greenness, producing measurements that can be used to generate a map recommending various amounts of nitrogen to be applied later. Alternatively, the measurements can be linked directly to the nitrogen applicator to change the application rate on the go. A tractor may have a sensor mounted on the front and an applicator on the back; by the time the applicator reaches a point that the sensor has just passed, an algorithm has converted the readings into settings for how much fertilizer to apply. Because research in this area has focused mainly on small grains, such as wheat, barley, rye, and oats, the technology is mostly limited to the parts of the United States and Europe that grow those crops. According to a 2013 survey by Purdue University, only seven percent of agricultural-input dealers offer plant-color sensors. Given the number of start-ups in this area, however, it is clear that many investors see the technology as a potential gold mine.
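The on-the-go version is a small control loop: the front-mounted sensor reads how green the crop is, an algorithm turns that reading into a nitrogen rate, and the rear applicator is set to that rate by the time it reaches the spot the sensor measured. A toy sketch with an invented linear rule — commercial sensors use calibrated, crop-specific algorithms, not these numbers:

```python
def nitrogen_rate(greenness, min_g=0.3, max_g=0.8, max_rate=150.0):
    """
    Map a normalised greenness reading (0-1) to a nitrogen rate in kg/ha.
    Pale plants (low greenness) get the full rate; dark green plants get none.
    Thresholds and rates are illustrative, not agronomic recommendations.
    """
    if greenness <= min_g:
        return max_rate
    if greenness >= max_g:
        return 0.0
    return max_rate * (max_g - greenness) / (max_g - min_g)  # linear ramp

# Simulated pass: readings arrive as the tractor moves, and the applicator
# setting is updated for each point the sensor has just passed.
for reading in (0.25, 0.45, 0.65, 0.85):
    print(f"greenness {reading:.2f} -> apply {nitrogen_rate(reading):.0f} kg N/ha")
```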
FIELDS AND YIELDS
The government’s GPS decision also enabled another revolutionary technology to emerge: yield monitoring. Most harvesters in the United States and Europe are outfitted with special sensors that measure the flow rate of grain coming in. An algorithm specific to the crop then converts the resulting data into a commonly used volume or weight, such as bushels per acre or kilograms per hectare. That information is then turned into colorful maps that show the variation within fields.
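The conversion behind those maps is simple arithmetic: the mass flow of grain, divided by the area swept per second (harvester speed times header width), gives yield per unit area, which is then scaled by a per-crop calibration factor and tagged with the GPS position. A simplified sketch with invented numbers:

```python
def yield_t_per_ha(flow_kg_s, speed_m_s, header_width_m, calibration=1.0):
    """
    Convert a grain mass-flow reading into yield in tonnes per hectare.

    area harvested per second (m^2) = speed * header width
    yield (kg/m^2) = calibrated flow / area per second
    1 ha = 10,000 m^2 and 1 tonne = 1,000 kg
    The calibration factor stands in for the annual per-crop calibration.
    """
    area_per_second_m2 = speed_m_s * header_width_m
    kg_per_m2 = (flow_kg_s * calibration) / area_per_second_m2
    return kg_per_m2 * 10_000 / 1_000

# Example: 10 kg/s of grain at 2 m/s with a 9 m header is roughly 5.6 t/ha.
print(round(yield_t_per_ha(10.0, 2.0, 9.0), 2))
```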
These maps have become a staple of farming magazines and trade shows, and for good reason: they have given farmers unprecedented insight into the effects of various production techniques, weather conditions, and soil types. Such a map can help a farmer arrive at yield numbers for the purpose of insurance or government programs, measure the results of experiments that test the qualities of genetically modified crops or the effectiveness of various cultivation practices, and reveal which parts of a field aren’t living up to their potential. In the eastern United States, it was only through yield monitoring that farmers were able to convince landlords that flood-related crop losses were not limited to completely submerged parts of the field; they also extended to a ring around those spots. In response, farmers installed more subsurface drainage systems. In Argentina, the technology has taken off because most managers of large farms there, unlike their U.S. counterparts, rarely operate vehicles themselves (a consequence of the peculiar history of landownership there). For them, yield maps offered on-the-ground insight into productivity they couldn’t otherwise get.
When it comes to the quality of the data, however, yield-monitoring technology still has a long way to go. In most cases, the algorithms that convert data about flow into volume or weight measurements must be calibrated annually for each crop and farm, and many farmers don’t bother to do so. The data can also be affected by how fast a harvester is driven and other idiosyncrasies. And although research studies can rigorously analyze data from yield monitoring, farms and agribusinesses typically lack the necessary statistical skills and software. The next step in yield monitoring is for agribusinesses to adopt the statistical techniques now used mainly by researchers; since their findings would be spread across millions of acres, they should be able to justify the cost.
PLOW BY WIRE
The most common use of precision-agriculture technology is for guiding tractors with GPS. Manually steering farm equipment requires skilled operators and is a tiring endeavor. And even the best drivers often overlap their passes by as much as ten percent to avoid skipping parts of the ground. The late 1990s saw the introduction of LED light bars, each a series of LED lights in a foot-long plastic case that is mounted in front of the operator of a tractor, harvester, or other vehicle. If the lights in the center are lit up, then the equipment is on track. If those on the left or the right are illuminated, then the driver needs to correct the steering.
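Underneath, a light bar is doing one calculation: the perpendicular (cross-track) distance between the vehicle and the intended pass line, which determines which LEDs light up. A small sketch of that geometry on a flat local grid, in metres rather than raw GPS coordinates; note that real systems differ on whether the lit side means "you have drifted this way" or "steer this way":

```python
def cross_track_error(pos, line_start, line_end):
    """
    Signed perpendicular distance (metres) from the vehicle to the planned
    A-B pass line. Positive means the vehicle sits to the right of the line
    when facing from A toward B.
    """
    ax, ay = line_start
    bx, by = line_end
    px, py = pos
    dx, dy = bx - ax, by - ay
    length = (dx * dx + dy * dy) ** 0.5
    cross = dx * (py - ay) - dy * (px - ax)  # 2D cross product of AB and AP
    return -cross / length

def light_bar(error_m, tolerance_m=0.1):
    """Centre lights mean on track; side lights mean a correction is needed."""
    if abs(error_m) <= tolerance_m:
        return "centre"
    return "right" if error_m > 0 else "left"

# Vehicle 0.35 m to the right of a north-south pass line -> right-hand lights.
err = cross_track_error(pos=(0.35, 40.0), line_start=(0.0, 0.0), line_end=(0.0, 100.0))
print(light_bar(err))
```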
Increasingly, farmers are taking this technology to its next logical step, replacing the light bars with automatic guidance systems that link GPS data directly to a vehicle’s steering mechanism. Although an operator still needs to sit on the equipment, for the most part, it can be driven hands-free. The technology first gained widespread use in the 1990s in Australia, where clay-rich soils—plus a lack of freezing and thawing—make fields particularly vulnerable to compaction from wheeled vehicles. Australian farms used GPS automated guidance to concentrate equipment traffic on narrow paths, preventing the rest of the soil from getting compacted. Today, about 40 percent of fertilizer and other agricultural chemicals are applied with automated guidance in the United States.
Such systems have led to numerous spinoffs. One category is mechanisms that track the path of a tractor and automatically shut off its seed-planting and chemical-spraying functions when it passes over parts of the field that have already been covered or are environmentally sensitive. The technology is especially useful for irregularly shaped fields, which are vulnerable to overplanting and overspraying.
Geospatial data aren’t just for plowing straight lines, however. For decades, NASA and some of its foreign counterparts have encouraged farmers to use their satellite imagery. Along with aerial photography, these images form the basis for “geographic information systems,” which enable farmers to store and analyze spatial data. The technology has proved particularly useful in areas where multiyear data are available, since it allows growers to divide large fields into zones that receive different seeds, fertilizers, and herbicides.
Some managers of farms are even using GPS to keep an eye on their employees in the field, especially in the former Soviet Union and particularly in Ukraine. Since the biggest farms there—many of which cover over 100,000 acres—tend to rely on hired staff and not owner-operators, farm managers like to track all field operations in real time. If a tractor stops for more than a few minutes, for example, the head office will notice and can call the driver to inquire about the problem. The tracking technology also allows managers to crack down on employees who use company machines on their own farms.
YIELD OF DREAMS
Precision agriculture has already turned one of the oldest sectors into one of the most high-tech, but the best is yet to come. The next step likely involves “big data.” Farmers and agribusinesses are increasingly considering how to best take advantage of their treasure troves of data to boost profits and make agriculture more sustainable. In 2013, for example, the agriculture giant Monsanto acquired the Climate Corporation, a start-up founded by two Google alumni to use weather and soil data to create insurance plans for farmers and generate recommendations for which crop varietals are best suited to a particular plot of land. Another low-hanging fruit for big data is research on how to use equipment. For example, it’s not clear how fast a tractor should be driven when planting corn: too slow makes for an inefficient process, but too fast results in uneven planting, which hurts yields. After collecting data on the tractor’s speed, the eventual yield of the crop, and other factors, however, one could determine the optimal speed for planting.
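The planting-speed question becomes a small optimization problem once the data exist: fit a curve to observed (speed, yield) pairs and read off where it peaks. A toy sketch with invented trial numbers — a fuller analysis would also weigh the cost of driving slowly, not just the yield:

```python
import numpy as np

# Hypothetical trials: planting speed (km/h) versus eventual corn yield (t/ha).
# Too slow is inefficient; too fast gives uneven planting and lower yield.
speeds = np.array([4.0, 6.0, 8.0, 10.0, 12.0])
yields = np.array([10.8, 11.4, 11.6, 11.1, 10.2])

# Fit a quadratic yield curve and find the speed at which it peaks.
a, b, c = np.polyfit(speeds, yields, deg=2)
optimal_speed = -b / (2 * a)  # vertex of the parabola (a < 0, so a maximum)

print(f"estimated yield-maximising planting speed: {optimal_speed:.1f} km/h")
```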
In order to harness big data’s power, companies will probably have to pool information across farms. In the United States and Europe, individual farms are too small to generate a meaningful quantity of data, and even the very large farms in Latin America and the former Soviet Union would benefit from combining data with their neighbors. The problem, at the moment, is that farmers have little incentive to collect quality data. In the United States, some start-ups have tried to pay farmers for data, without much success. So far, it is the agricultural-input suppliers and agricultural cooperatives that have been able to collect the most data. But even their data sets are relatively small.
Some of that big data may come from drones. With the United States largely out of Afghanistan and Iraq, some suppliers of military hardware have turned their attention to the agricultural market. The move might be smart: small, unmanned aircraft can capture regular images of crops to guide irrigation, pesticide application, and harvesting. And unlike satellites, drones are largely unaffected by cloud cover. Given the operating expense and expertise required, drones will most likely be used commercially at first only for high-value crops, such as wine grapes. And in the United States, the Federal Aviation Administration will first have to open up the skies to commercial drones.
The technology that would truly transform agriculture as we know it is robotics. The rapid adoption of GPS guidance has opened the door to more autonomous farm equipment, and most major manufacturers have already tested driverless versions of their tractors. Once the driver is removed from the picture, the design criteria for a piece of equipment change radically: it can become far smaller. It’s possible to imagine farms someday filled with hundreds of small autonomous robots, doing everything from planting to harvesting. Robots could scout fields continuously and identify pest and disease problems at the earliest possible stages. They could apply pesticides in tiny doses, targeting individual insects or diseased plants. They could efficiently manage small and oddly shaped fields, such as those common in the eastern United States, which are hard to farm profitably with conventional equipment driven by humans. In the United States, by reducing the need for Mexican laborers, robots might even affect immigration policy.
When it comes to emerging technologies, it is a fool’s errand to pick winners. But the history of farming in the twentieth century offers some clues to its future. Almost all the agricultural technologies that were widely adopted in the twentieth century were characterized by what economists call “embodied knowledge,” meaning that the scientific advancements were contained within them. Farmers didn’t have to know how pesticides killed insects or how a gasoline tractor worked; they just needed to know how to spray the chemical or drive the vehicle.
Likewise, the tools of precision agriculture will gain widespread use only once they are sold in easy-to-use forms. That’s why GPS guidance has become so widespread: farmers don’t need to understand it to use it. And so variable rate technology for fertilizer, to take one example, will take off the day a farmer can trigger it with the mere push of a button. Eventually, precision agriculture could take humans out of the loop entirely. Once that happens, the world won’t just see huge gains in productivity. It will see a fundamental shift in the history of agriculture: farming without farmers.
Via the Sacramento Bee, a look at how drones are revolutionizing conservation:
When the rain finally came to Sacramento in early February, Nature Conservancy scientist Chris McColl needed to quickly assess whether water had overflowed the banks of the Cosumnes River and filled a floodplain the organization is trying to restore.
Planes are expensive, and it takes hours to hire one and get it in the air. So McColl deployed a drone instead.
Cheap, fast and flexible, drones are quickly becoming a favored tool for organizations conducting scientific research. The Nature Conservancy has four drones operating from San Diego to the northern Sierra. The remote aircraft count sandhill cranes, survey flood restoration projects and perform other tasks.
McColl said the conservancy would double its California drone arsenal within the next year.
At the Cosumnes River Preserve south of Sacramento, where a levee was recently removed to allow flooding to resume its natural, historic pattern, scientists use a drone to see if their efforts are paying off. They previously would have had to eyeball the flooding from the ground – not the most accurate way to measure it – or hire an airplane to fly over.
“If we had to hire a pilot, the time flexibility might not be there,” McColl said. “The drones allow us to be really responsive after storm events because we need to be there within hours to catch certain flood levels.”
At the Cosumnes River Preserve, the conservancy is using a Phantom Vision 2 drone that sports four propellers and a GoPro camera attached to a gimbal.
Total cost: $1,400. The costs of repeat flights are minimal: manpower and battery juice. In contrast, hiring a pilot to fly over the site can run anywhere from $1,500 to $3,000 per trip.
The drone can pay for itself in a single trip. It also gives the conservancy complete control over the images captured. The only limitations are short battery life and noise.
Although small, the Phantom Vision 2 emits a high-pitched whirring sound that scares off sandhill cranes, McColl said. Quieter fixed-wing models that have not scared the birds are available, but those drones cost $10,000 to $30,000 each.
Another scientist who has been using drones for research is Christopher Zappa, ocean and climate researcher at the Lamont-Doherty Earth Observatory at Columbia University.
Zappa has deployed drones since 2011 to test new instruments that measure ocean temperature and ocean waves. He also uses drones to drop off micro-buoys that can measure water temperature and salinity.
“We’re looking at the marginal ice zone, where the ocean meets the edge of the ice, seeing how fast and how slow the melting is occurring,” Zappa said. “Monitoring ice melt in the Arctic, drones will fly to places that icebreakers and manned aircraft don’t dare venture.”
Drones are not limited to the air. Recently, the Woods Hole Oceanographic Institution started monitoring polar ice with underwater drones. A similar drone is being used to observe the deep-water habits of great white sharks.
The technology is still in its infancy, as are the laws that regulate its use. Last month, the Federal Aviation Administration issued a ruling on allowing commercial use of drones, under strict guidelines.
The ruling, which needs final approval, allows drones to be flown for recreational purposes. For commercial use, a pilot’s license is required.
The new FAA rules allow the operation of drones under 55 pounds. The drones must be flown below 500 feet, kept 5 miles from an airport and never directly over people. The operator must also maintain a constant visual line of sight with the drone.
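Written as a checklist, those limits are easy to encode. A hypothetical pre-flight check against just the constraints summarized above — the actual rules contain far more conditions than this:

```python
def preflight_ok(weight_lb, altitude_ft, miles_to_airport, over_people, visual_line_of_sight):
    """Check a planned flight against the limits listed above (illustrative only)."""
    checks = [
        (weight_lb < 55, "drone must weigh under 55 pounds"),
        (altitude_ft < 500, "flight must stay below 500 feet"),
        (miles_to_airport >= 5, "must stay 5 miles from an airport"),
        (not over_people, "must not fly directly over people"),
        (visual_line_of_sight, "operator must keep visual line of sight"),
    ]
    failures = [reason for passed, reason in checks if not passed]
    return len(failures) == 0, failures

ok, problems = preflight_ok(3.1, 250, 12, over_people=False, visual_line_of_sight=True)
print("clear to fly" if ok else problems)
```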
With the Cosumnes flyovers, no pilot’s license was necessary since the conservancy was flying over its own land.
The river and adjacent fields are a living floodplain laboratory unusual in California, since the Cosumnes is one of the last rivers without a dam.
“The river is pretty dramatic because it’s a seasonal river that goes from completely dry to flooded in winter,” said Judah Grossman, project manager with the Nature Conservancy. “This is unique to the Cosumnes.”
“We’re hoping that allowing the river to overtop more frequently into a larger area will let floodwater percolate down and recharge the aquifer,” Grossman said. “And, hopefully, this will bring groundwater levels up again.”
Groundwater recharge is a big issue for a thirsty state in a multiyear drought. Last month, the U.S. Geological Survey reported that California is depleting its groundwater faster than any other state in the country.
The drones capture time-lapse photos and video that will help show whether the conservancy’s efforts are succeeding.
Via Outside Magazine, a brief look at the convergence of big data and conservation:
Poaching is big business, with black-market elephant tusks bringing in $30,000 and rhino horns fetching $300,000. But conservationists are fighting back with an arsenal of increasingly effective high-tech weapons.
WildLeaks (wildleaks.org) lets anyone submit tips about poachers, information that is vetted and shared with law enforcement.
Eyes on the Forest (eyesontheforest.or.id) fights Indonesian deforestation by using Google Earth to combine satellite tech with ground-based reporting.
In 2013, no elephants, rhinos, or tigers were killed in Nepal, a statistic credited to the use of drones, which can monitor parklands much faster and much more safely than rangers on foot.
A program called SMART (for Spatial Monitoring and Reporting Tool) aggregates poaching tips and footage from drones and camera traps to map patterns of animal and poacher movement.
New technology called DNA fingerprinting is making it possible to collect DNA from animal feces and seized animal products, which investigators use to help map the path of trafficked items.