People | Locations | Statistics |
---|---|---|
Tekkaya, A. Erman | | |
Förster, Peter | | |
Mudimu, George T. | | |
Shibata, Lillian Marie | | |
Talabbeydokhti, Nasser | | |
Laffite, Ernesto Dante Rodriguez | | |
Schöpke, Benito | | |
Gobis, Anna | | |
Alfares, Hesham K. | | |
Münzel, Thomas | | |
Joy, Gemini Velleringatt | | |
Oubahman, Laila | | |
Filali, Youssef | | |
Philippi, Paula | | |
George, Alinda | | |
Lucia, Caterina De | | |
Avril, Ludovic | | |
Belachew, Zigyalew Gashaw | | |
Kassens-Noor, Eva | Darmstadt | |
Cho, Seongchul | | |
Tonne, Cathryn | | |
Hosseinlou, Farhad | | |
Ganvit, Harsh | | |
Schmitt, Konrad Erich Kork | | |
Grimm, Daniel | | |
Bauer-Gottwein, Peter
Technical University of Denmark
Cooperation score: 37%
Topics
Publications (12/12 displayed)
- 2023 Mapping inland water bathymetry with Ground Penetrating Radar (GPR) on board Unmanned Aerial Systems (UASs)
- 2020 The value of distributed high-resolution UAV-borne observations of water surface elevation for river management and hydrodynamic modeling
- 2020 Unmanned Aerial System (UAS) observations of water surface elevation in a small stream
- 2018 Unmanned aerial vehicle observations of water surface elevation and bathymetry in the cenotes and lagoons of the Yucatan Peninsula, Mexico
- 2018 Technical note: Bathymetry observations of inland water bodies using a tethered single-beam sonar controlled by an unmanned aerial vehicle
- 2018 Dataset used in "Bathymetry observations of inland water bodies using a tethered single-beam sonar controlled by an unmanned aerial vehicle". https://doi.org/10.5194/hess-2017-625
- 2017 Measuring water level in rivers and lakes from lightweight Unmanned Aerial Vehicles
- 2017 Water level observations from Unmanned Aerial Vehicles for improving estimates of surface water-groundwater interaction
- 2017 Optimizing sensitivity of Unmanned Aerial System optical sensors for low zenith angles and cloudy conditions
- 2016 Multi‐angular observations of vegetation indices from UAV cameras
- 2016 Multi‐angular observations of vegetation indices from UAV cameras
- 2016 Hyperspatial mapping of water, energy and carbon fluxes with Unmanned Aerial Vehicles
Places of action
Organizations | Location | People |
---|---|---|
Multi‐angular observations of vegetation indices from UAV cameras
Abstract
Unmanned aerial vehicles (UAVs) are an alternative to classical manned aerial photogrammetry and can be used to obtain environmental data or as a complement to other methods (Nex and Remondino, 2014). Although UAVs have coverage limitations, they offer better resolution than satellites and aircraft, are cheaper and easier to handle, and provide data within a short period of time (Matese et al., 2015; Uysal, Toprak and Polat, 2015). Furthermore, they can be equipped with different payloads carrying various sensors, such as thermal and multispectral cameras (Berni et al., 2009), hyperspectral cameras (Burkart et al., 2015) and photometric elevation mapping sensors (Shahbazi et al., 2015), among others. Therefore, UAVs can be used in many fields such as agriculture, forestry, archeology, architecture, and environmental and traffic monitoring (Nex and Remondino, 2014). In this study, the UAV is an S900 hexacopter equipped with a Global Positioning System (GPS) and two cameras: a digital RGB photo camera and a multispectral camera (MCA), with resolutions of 5472 x 3648 pixels and 1280 x 1024 pixels, respectively. In terms of applications, traditional methods that derive vegetation indices from reflectance often assume Lambertian models (de Moura et al., 2015), in which light is reflected equally in all directions (Mobley, 2014), so multi‐angular reflectance is not considered. However, differences in directional scattering (anisotropy) can provide important information about biophysical properties of vegetation such as leaf area index (LAI), leaf angular distribution (LAD), vegetation water content, nitrogen and chlorophyll content (Tagesson et al., 2015), canopy roughness and others (de Moura et al., 2015). The Bidirectional Reflectance Distribution Function (BRDF) describes how surface reflectance changes with viewing geometry and is commonly used to analyze remote sensing data from satellite, airborne and surface platforms (Singh et al., 2016). BRDF observations can also be obtained with the MCA camera mounted on the UAV. Thus, the aim of this study is to capture multi‐angular observations at different forest locations (Sorø and Risø) in Denmark by flying the UAV over the area of interest. Since the payload has a fixed position, the range of viewing angles arising from the field of view (FOV) of the MCA camera can be exploited, and the flight pattern simulates a set of goniometer positions. This approach makes it possible to measure different azimuth and zenith angles relative to the sun position and to capture different vegetation characteristics depending on the time of day and the amount of light.
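The fixed camera mounting described above means that each image already spans a range of viewing directions set by the MCA's field of view. As a rough illustration of that geometry (a minimal sketch, not the authors' processing chain), the Python snippet below computes per-pixel view zenith and azimuth angles for a nadir-pointing pinhole camera; the image size matches the MCA resolution given in the abstract, while the FOV values, the `view_angles` helper and the simple heading rotation are illustrative assumptions.

```python
import numpy as np

# MCA image size as stated in the abstract; the field-of-view values are
# illustrative placeholders, not taken from the paper.
NCOLS, NROWS = 1280, 1024
FOV_X_DEG, FOV_Y_DEG = 38.0, 31.0

def view_angles(heading_deg=0.0):
    """Return per-pixel view zenith and azimuth angles (degrees).

    heading_deg: assumed UAV heading, used to rotate pixel azimuths into a
    north-referenced frame so they can be compared with the solar azimuth.
    """
    # Tangent-plane (pinhole) coordinates of each pixel's line of sight,
    # measured from the optical axis (nadir).
    half_x = np.tan(np.radians(FOV_X_DEG / 2.0))
    half_y = np.tan(np.radians(FOV_Y_DEG / 2.0))
    tan_x, tan_y = np.meshgrid(np.linspace(-half_x, half_x, NCOLS),
                               np.linspace(-half_y, half_y, NROWS))

    # View zenith angle grows from 0 at the image centre toward the edges;
    # view azimuth is the pixel's direction rotated by the UAV heading.
    zenith = np.degrees(np.arctan(np.hypot(tan_x, tan_y)))
    azimuth = (np.degrees(np.arctan2(tan_x, tan_y)) + heading_deg) % 360.0
    return zenith, azimuth

if __name__ == "__main__":
    vza, vaa = view_angles(heading_deg=90.0)
    # Maximum view zenith angle is roughly half the camera's diagonal FOV.
    print("max view zenith angle: %.1f deg" % vza.max())
```

Combined with the UAV's GPS position and the solar position at acquisition time, per-pixel angle grids of this kind are what let the flight pattern stand in for a set of goniometer positions, as the abstract describes.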
Topics