The Web contains some extremely valuable information; however, poor-quality, inaccurate, irrelevant, or fraudulent information can often be found as well. With the increasing amount of data available, it is becoming more and more difficult to distinguish truth from speculation on the Web. One of the most important criteria used to evaluate data credibility, if not the most important, is the information source, i.e., the data origin. Trust in the information source is a valuable currency users have for evaluating such data. Data popularity, recency (or time of validity), reliability, or vagueness ascribed to the data may also help users judge the validity and appropriateness of information sources. We call this knowledge derived from the data the provenance of the data. Provenance is an important aspect of the Web. It is essential for identifying the suitability, veracity, and reliability of information, and for deciding whether information is to be trusted, reused, or even integrated with other information sources. Therefore, models and frameworks for representing, managing, and using provenance in the realm of Semantic Web technologies and applications are critically needed. This thesis highlights the benefits of using provenance in different Web applications and scenarios. In particular, it presents management frameworks for querying and reasoning in the Semantic Web with provenance, along with a collection of Semantic Web tools that exploit provenance information when ranking and updating caches of Web data. To begin, this thesis discusses a highly flexible and generic approach to the treatment of provenance when querying RDF datasets. The approach reuses existing RDF modeling possibilities in order to represent provenance. It extends SPARQL query processing in such a way that, given a SPARQL query for data, one may request provenance without modifying the query.
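As an illustrative sketch of the idea (not the thesis's actual SPARQL extension), one can keep each RDF triple together with a named graph recording its source, so that a pattern query returns provenance alongside the data without changing the query pattern itself; all identifiers below are made up:

```python
# Hypothetical sketch: triples are stored together with a named graph that
# records their source, and a pattern query returns each match alongside
# its provenance.

# (subject, predicate, object, source_graph)
quads = [
    ("ex:Bonn",   "ex:locatedIn",  "ex:Germany", "http://dbpedia.org"),
    ("ex:Bonn",   "ex:population", "330000",     "http://example.org/census"),
    ("ex:Berlin", "ex:locatedIn",  "ex:Germany", "http://dbpedia.org"),
]

def query(pattern):
    """Match an (s, p, o) pattern where None is a wildcard; return
    (triple, provenance) pairs, so provenance comes for free with the data."""
    s, p, o = pattern
    return [((qs, qp, qo), src) for qs, qp, qo, src in quads
            if (s is None or s == qs)
            and (p is None or p == qp)
            and (o is None or o == qo)]

for triple, source in query((None, "ex:locatedIn", "ex:Germany")):
    print(triple[0], "<-", source)
```

The same pattern query serves both the data request and the provenance request, which mirrors the abstract's point that provenance can be obtained without modifying the original query.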
The use of provenance within SPARQL queries helps users understand how RDF facts are derived, i.e., it describes the data and the operations used to produce the derived facts. Turning to more expressive Semantic Web data models, an optimized algorithm for reasoning over and debugging OWL ontologies with provenance is presented. Typical reasoning tasks over an expressive Description Logic (e.g., using tableau methods to perform consistency checking, instance checking, satisfiability checking, and so on) are in the worst case doubly exponential, and in practice are often likewise very expensive. With the algorithm described in this thesis, however, one can efficiently reason in OWL ontologies with provenance, i.e., provenance is efficiently combined and propagated within the reasoning process. Users can use the derived provenance information to judge the reliability of inferences and to find errors in the ontology. Next, this thesis tackles the problem of providing Web users with the right content at the right time. The challenge is to efficiently rank a stream of messages based on user preferences. Provenance is used to represent preferences, i.e., the user defines his or her preferences over the messages' popularity, recency, etc. This information is then aggregated to obtain a joint ranking. The aggregation problem is related to the problem of preference aggregation in Social Choice Theory. The traditional formulation of preference aggregation assumes a fixed set of preference orders and a fixed set of domain elements (e.g., messages). This work, however, investigates how an aggregated preference order has to be updated when the domain is dynamic, i.e., the aggregation approach ranks messages 'on the fly' as they pass through the system. Consequently, this thesis presents computational approaches for online preference aggregation that handle the dynamic setting more efficiently than standard ones.
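The 'on the fly' aggregation idea can be sketched as follows. This is a simplified weighted scoring rule, not the thesis's actual algorithm; the criteria, their weights, and all message values are assumptions:

```python
import bisect

# Hypothetical sketch: aggregate provenance-based preference criteria
# (recency, popularity) into one ranking via a weighted scoring rule, and
# insert each arriving message into the ranking "on the fly" instead of
# re-sorting the whole stream on every arrival.

WEIGHTS = {"recency": 0.6, "popularity": 0.4}  # assumed user preferences

def score(msg):
    return sum(w * msg[c] for c, w in WEIGHTS.items())

class OnlineRanking:
    def __init__(self):
        self._keys = []   # negated scores, so the list sorts best-first
        self._msgs = []

    def push(self, msg):
        k = -score(msg)
        i = bisect.bisect_left(self._keys, k)  # O(log n) search, O(n) insert
        self._keys.insert(i, k)
        self._msgs.insert(i, msg)

    def top(self, n):
        return [m["id"] for m in self._msgs[:n]]

r = OnlineRanking()
for m in [{"id": "a", "recency": 0.2, "popularity": 0.9},
          {"id": "b", "recency": 0.9, "popularity": 0.1},
          {"id": "c", "recency": 0.8, "popularity": 0.8}]:
    r.push(m)
print(r.top(2))
```

A scoring rule is only one of many aggregation rules studied in Social Choice Theory; rank-based rules such as Borda would require shifting the positions of existing elements, which is precisely where the dynamic setting becomes non-trivial.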
Lastly, this thesis addresses the scenario of caching data from the Linked Open Data (LOD) cloud. Data on the LOD cloud changes frequently, and applications relying on that data - by pre-fetching data from the Web and storing local copies of it in a cache - need to continually update their caches. In order to make the best use of the available resources (e.g., network bandwidth for fetching data, and computation time), it is vital to choose a good strategy for knowing when to fetch data from which data source. One strategy for coping with data changes is to check provenance. Provenance information delivered by LOD sources can denote when a resource on the Web was last changed. Linked Data applications can benefit from this piece of information, since simply checking it may help users decide which sources need to be updated. For this purpose, this work investigates the availability and reliability of provenance information in Linked Data sources. Another strategy for capturing data changes is to exploit provenance in a time-dependent function. Such a function should measure the frequency of changes of LOD sources. This work therefore describes an approach to the analysis of data dynamics, i.e., the change behavior of Linked Data sources over time, followed by an investigation of different update scheduling strategies to keep local LOD caches up to date. This thesis aims to demonstrate the importance and benefits of using provenance in different Web applications and scenarios. The flexibility of the approaches presented, combined with their high scalability, makes this thesis a possible building block for the Semantic Web proof layer cake - the layer of provenance knowledge.
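A minimal sketch of such an update scheduling strategy (assumed numbers and source names, not the thesis's actual scheduler): estimate each source's change rate from past observations and refresh the source with the most expected unseen changes:

```python
# Hypothetical sketch: pick which LOD source to refresh next by estimating
# each source's change rate from its observed history and fetching the
# source with the most expected missed changes (rate x time since fetch).

sources = {
    # name: (changes observed, observation window in days, days since last fetch)
    "dbpedia":     (4, 30, 10),
    "geonames":    (1, 30, 3),
    "musicbrainz": (15, 30, 2),
}

def expected_missed_changes(changes, window, age):
    rate = changes / window          # changes per day (Poisson-style estimate)
    return rate * age

def next_to_fetch(srcs):
    return max(srcs, key=lambda s: expected_missed_changes(*srcs[s]))

print(next_to_fetch(sources))
```

Under this greedy rule a rarely changing but long-neglected source can still outrank a fast-changing one that was fetched very recently, which is the intuition behind scheduling by change behavior rather than by a fixed refresh interval.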
This thesis proposes the use of MSR (Mining Software Repositories) techniques to identify software developers with exclusive expertise in specific APIs and programming domains within software repositories. A prototype tool for finding such "islands of knowledge" in Node.js projects is presented and applied in a case study to 180 npm packages. The study shows that each package has on average 2.3 islands of knowledge, which might be explained by the tendency of npm packages to have only a single main contributor. In a survey, the maintainers of 50 packages were contacted and asked for their opinion on the tool's results. Together with their answers, this thesis reports on the experience gained with the prototype tool and on how future extensions could make even better statements about the distribution of programming expertise in developer teams.
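The "island of knowledge" detection can be sketched as follows. This is a hypothetical simplification, not the presented tool: the authors, APIs, and the 80% exclusivity threshold are all assumptions:

```python
from collections import Counter, defaultdict

# Hypothetical sketch: from (author, api) pairs mined from commits, flag
# every API whose changes come almost exclusively (here: >= 80%, an assumed
# threshold) from a single developer - an "island of knowledge".

commits = [
    ("alice", "express"), ("alice", "express"), ("alice", "express"),
    ("bob",   "express"),
    ("alice", "lodash"), ("bob", "lodash"),
    ("carol", "crypto"), ("carol", "crypto"),
]

def islands(commits, threshold=0.8):
    per_api = defaultdict(Counter)
    for author, api in commits:
        per_api[api][author] += 1
    found = {}
    for api, counts in per_api.items():
        author, n = counts.most_common(1)[0]
        if n / sum(counts.values()) >= threshold:
            found[api] = author
    return found

print(islands(commits))
```

A real MSR pipeline would mine the (author, API) pairs from commit diffs and import statements rather than take them as given, but the aggregation step would look much like this.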
With global and distributed project teams becoming increasingly common, Collaborative Project Management is emerging as the prevalent paradigm for work in most organisations. Software has for many years been one of the most widely used means of supporting Project Management, and with the focus on Collaborative Project Management, accompanied by the emergence of Enterprise Collaboration Systems (ECS), Collaborative Project Management Software (CPMS) is gaining increased attention. This thesis examines the capabilities of CPMS for the long-term management of information, which includes not only the management of files within these systems but the management of all types of digital business documents, particularly social business documents. Previous research shows that social content in collaboration software is often poorly managed, which poses challenges to meeting performance and conformance objectives in a business. Based on a literature review, requirements for the long-term management of information in CPMS are defined, and seven CPMS tools are analysed with regard to the content they contain and the functionalities they offer for its long-term management. The study shows that CPMS by and large are not able to meet the long-term information management needs of an organisation on their own, and that only the tools geared towards enterprise customers have sufficient capabilities to support the implementation of an Enterprise Information Management strategy.
The Internet of Things (IoT) is a network of addressable physical objects that provide sensor, communication, and actuator technologies and interact with their environment (Geschickter 2015). Like every new concept, IoT has attracted interest across all application domains, in theory as well as in practice, and has pushed the available technologies to their limits. These limits become particularly noticeable as the number of things to be managed across the most diverse application domains grows. To meet the novel requirements, a wealth of different systems has been developed, each applying its own interpretation of an IoT architecture and its respective components. As a result, IoT is currently more an Intranet of Things than an Internet of Things (Zorzi et al. 2010). The goal of this thesis is therefore to establish a unified understanding of the components that form an IoT architecture and to provide generic specifications in the form of a Holistic IoT Architecture Framework.
This thesis uses Design Science Research (DSR) to develop this architecture on the basis of the relevant literature. The development of the Holistic IoT Architecture Framework draws on two new perspectives on IoT architectures (IoT Architecture Perspectives) that were identified during the analysis of IoT architectures in the literature. Applying these new perspectives led to the insight that a further, likewise novel, component is implicitly mentioned in the literature. The component descriptions of various IoT architectures were unified and related to this new component, the Thing Management System, in order to develop the Holistic IoT Architecture Framework. Furthermore, it was shown that the specifications of the architecture are suitable as a template for the implementation of a prototype.
The main contribution of this thesis is a unified understanding of the individual components of an IoT architecture and of their interactions.
Pelagic oxyclines, the transition zone between oxygen-rich surface waters and oxygen-depleted deep waters, are a common characteristic of eutrophic lakes during summer stratification. They can have tremendous effects on the biodiversity and ecosystem functioning of lakes and, to add insult to injury, are expected to become more frequent and more pronounced as climate warming progresses. On these grounds, this thesis endeavors to advance the understanding of the formation, persistence, and consequences of pelagic oxyclines: We test whether the formation of metalimnetic oxygen minima is intrinsically tied to a locally enhanced oxygen-consuming process, investigate the relative importance of vertical physical oxygen transport and biochemical oxygen consumption for the persistence of pelagic oxyclines, and finally assess their potential consequences for whole-lake cycling. To pursue these objectives, the present thesis relies almost exclusively on in situ measurements. Field campaigns were conducted at three lakes in Germany featuring different types of oxyclines and resolved either a short (hours to days) or a long (weeks to months) time scale. Measurements comprised temperature, current velocity, and concentrations of oxygen and reduced substances in high temporal and vertical resolution. Additionally, vertical transport was estimated by applying the eddy correlation technique within the pelagic region for the first time. The thesis revealed that the formation of metalimnetic oxygen minima does not necessarily depend on locally enhanced oxygen depletion, but can result solely from gradients and curvatures of oxygen concentration and depletion and their relative position to each other. Physical oxygen transport was found to be relevant for oxycline persistence when it considerably postponed anoxia on a long time scale. However, its influence on oxygen dynamics was minor on short time scales, although mixing and transport were highly variable.
Biochemical consumption always dominated the fate of oxygen in pelagic oxyclines. It was primarily determined by the oxidative breakdown of organic matter originating from the epilimnion, whereas in meromictic lakes, the oxidation of reduced substances dominated. Beyond that, the results of the thesis emphasize that pelagic oxyclines can be a hotspot of mineralization and, hence, short-circuit carbon and nutrient cycling in the upper part of the water column. Overall, the present thesis highlights the importance of considering physical transport as well as biochemical cycling in future studies.
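The eddy correlation (eddy covariance) estimate mentioned above reduces, at its core, to the covariance of vertical velocity and oxygen concentration fluctuations. The sketch below uses made-up numbers; real processing additionally involves despiking, detrending, and averaging windows not shown here:

```python
from statistics import mean

# Hypothetical sketch of the eddy correlation flux estimate: the turbulent
# vertical oxygen flux is mean(w' * c'), where w' and c' are deviations of
# vertical velocity and oxygen concentration from their record means.

w = [0.01, -0.02, 0.015, -0.005, 0.0]    # vertical velocity (m/s), made-up
c = [250.2, 249.8, 250.3, 249.9, 250.0]  # O2 concentration (mmol/m3), made-up

def eddy_flux(w, c):
    wm, cm = mean(w), mean(c)
    return mean((wi - wm) * (ci - cm) for wi, ci in zip(w, c))

print(round(eddy_flux(w, c), 6))  # flux in mmol m^-2 s^-1
```

A positive covariance here means upward motions carry oxygen-rich water, i.e., a net upward oxygen flux at the measurement depth.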
Coordination and awareness mechanisms are important in Computer-Supported Cooperative Work (CSCW) and in traditional groupware. Research strives to investigate in depth their role in the use of groupware and in the human collaboration it supports, in order to describe their application and effectiveness. So far, however, no classification of these mechanisms has been undertaken that works out their commonalities and differences as well as their application, and that examines their significance in a collaborative environment. Moreover, the mechanisms have not yet been considered in new forms of groupware. In research as well as in practice, Enterprise Collaboration Systems (ECS) that incorporate social software functionality are of growing importance. Combining traditional groupware and social software components, these systems also contain mechanisms intended to ease collaboration, which, however, have not yet been sufficiently investigated.
The goal of this thesis is therefore to identify examples of coordination and awareness mechanisms in the academic literature in order to provide a first overview of them. Building on this, a further goal is to classify the example mechanisms. Based on a literature analysis, concepts from the literature are adopted and applied to the selected mechanisms in order to analyse and classify them. In doing so, the commonalities and differences of the mechanisms are worked out and described. To illustrate the application of coordination and awareness mechanisms, some mechanisms are visualised as examples. The examples relate to the different classification groups. The selection of mechanisms for visualisation is based on significant differences in their functionality. Subsequently, the selected mechanisms identified in the literature on traditional groupware are examined on a small scale in socially integrated ECS. The aim is to find out whether the example mechanisms can be found there and whether new mechanisms can be identified. The collaborative platform IBM Connections serves as a practical example of ECS with social software. IBM Connections is used at the University of Koblenz to operate the platform "UniConnect". An initial tool analysis works out which of the identified example mechanisms are applied in IBM Connections. This thesis represents first steps in the investigation of coordination and awareness mechanisms in ECS with social software. Furthermore, examples of new, previously unknown mechanisms are to be identified that are employed for collaborative work as a result of the social factor.
This contribution serves to identify and analyse examples of coordination and awareness mechanisms in the literature and to bring them together into a first overview. Furthermore, a first classification based on the differences between the mechanisms is undertaken. In addition, the contribution is intended to provide an incentive for further research investigating coordination and awareness mechanisms in socially integrated ECS in greater depth.
The world's ecosystems are under great pressure to satisfy anthropogenic demands, with freshwaters being of central importance. The Millennium Ecosystem Assessment has identified anthropogenic land use and its associated stressors as main drivers jeopardizing stream ecosystem functions and the biodiversity supported by freshwaters. Adverse effects on the biodiversity of freshwater organisms, such as macroinvertebrates, may propagate to fundamental ecosystem functions, such as organic matter breakdown (OMB), with potentially severe consequences for ecosystem services. In order to adequately protect and preserve freshwater ecosystems, investigations of the potential and observed as well as direct and indirect effects of anthropogenic land use and associated stressors (e.g. nutrients, pesticides or heavy metals) on ecosystem functioning and stream biodiversity are needed. While greater species diversity most likely benefits ecosystem functions, the direction and magnitude of changes in ecosystem functioning depend primarily on species' functional traits. In this context, the functional diversity of stream organisms has been suggested to be a more suitable predictor of changes in ecosystem functions than taxonomic diversity.
The thesis aims at investigating the effects of anthropogenic land use (i) on three ecosystem functions affected by anthropogenic toxicants, to identify effect thresholds (chapter 2), (ii) on organic matter breakdown across three land use categories, to identify effects at the functional level (chapter 3), and (iii) on the stream community along an established land-use gradient, to identify effects at the community level (chapter 4).
In chapter 2, I reviewed the literature on pesticide and heavy metal effects on OMB, primary production and community respiration. From each reviewed study that met the inclusion criteria, the toxicant concentration resulting in a reduction of at least 20% in an ecosystem function was standardized based on laboratory toxicity data. Effect thresholds were to be derived from the relationship between ecosystem functions and the standardized concentrations. The analysis revealed that more than one third of the pesticide observations indicated reductions in ecosystem functions at concentrations that are assumed to be protective in regulation. However, high variation within and between studies hampered the derivation of a concentration-effect relationship and thus of effect thresholds.
In chapter 3, I conducted a field study to determine microbial and invertebrate-mediated OMB by deploying fine- and coarse-mesh leaf bags in streams with forested, agricultural, vinicultural and urban riparian land use. Additionally, physicochemical, geographical and habitat parameters were monitored to explain potential differences in OMB among land use types and sites. Regarding results, only microbial OMB differed between land use types. Microbial OMB showed a negative relationship with pH, while invertebrate-mediated OMB was positively related to tree cover. OMB responded to stressor gradients rather than directly to land use.
In chapter 4, macroinvertebrates were sampled in concert with the leaf bag deployment, and after species identification (i) the taxonomic diversity in terms of Simpson diversity and total taxonomic richness (TTR) and (ii) the functional diversity in terms of bio-ecological traits and Rao's quadratic entropy were determined for each community. Additionally, a land-use gradient was established and the response of the taxonomic and functional diversity of invertebrate communities along this gradient was investigated, to examine whether these two metrics of biodiversity are predictive of the rate of OMB. Neither bio-ecological traits nor functional diversity showed a significant relationship with OMB. Although TTR decreased with increasing anthropogenic stress, and the community structure and 26% of bio-ecological traits were significantly related to the stress gradient, none of these shifts propagated to OMB.
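The two diversity metrics named above can be sketched as follows, with made-up abundances and trait distances. Simpson diversity uses only relative abundances, while Rao's quadratic entropy additionally weights species pairs by their functional trait distance:

```python
# Hypothetical sketch of the two biodiversity metrics: Simpson diversity
# (probability that two random individuals belong to different species) and
# Rao's quadratic entropy (the same draw weighted by functional distance).

def simpson(abundances):
    total = sum(abundances)
    p = [a / total for a in abundances]
    return 1 - sum(pi ** 2 for pi in p)

def rao_q(abundances, dist):
    """dist[i][j] is the functional trait distance between species i and j."""
    total = sum(abundances)
    p = [a / total for a in abundances]
    return sum(p[i] * p[j] * dist[i][j]
               for i in range(len(p)) for j in range(len(p)))

ab = [50, 30, 20]                 # made-up species abundances
d = [[0.0, 0.5, 1.0],             # made-up pairwise trait distances
     [0.5, 0.0, 0.2],
     [1.0, 0.2, 0.0]]
print(round(simpson(ab), 3), round(rao_q(ab, d), 3))
```

With identical abundances, Rao's Q can still differ between communities whose species are functionally similar versus dissimilar, which is why it is proposed as the more function-relevant metric.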
Our results show that the complexity of real-world situations in freshwater ecosystems impedes the effect assessment of chemicals and land use for functional endpoints, and consequently our ability to predict changes. We conclude that the current safety factors used in chemical risk assessment may not be sufficient to protect functional endpoints from pesticides. Furthermore, simplifying real-world stressor gradients into a few land use categories was unsuitable for predicting and quantifying losses in OMB. Thus, the monitoring of specific stressors may be more relevant than crude land use categories for detecting effects on ecosystem functions. This may, however, limit the large-scale assessment of the status of OMB. Finally, despite several functional changes in the communities, the functional diversity over several trait modalities remained similar. Neither taxonomic nor functional diversity was a suitable predictor of OMB. Thus, when understanding anthropogenic impacts on the linkage between biodiversity and ecosystem functioning is of main interest, focusing on diversity metrics that are clearly linked to the stressor in question (Jackson et al. 2016) or integrating taxonomic and functional metrics (Mondy et al. 2012) might enhance our predictive capacity.
Key mechanisms for the release of metal(loid)s from a construction material in hydraulic engineering
(2017)
Hydraulic engineering, and thus construction materials, are necessary to keep waterways navigable. Since a variety of natural as well as artificial materials are used, these materials are tested worldwide for a potential release of dangerous substances in order to prevent adverse effects on the environment. To determine the potential release, it is important to identify and understand the key mechanisms that are decisive for the release of hazardous substances. There is only a limited correlation between the conditions used in regulatory tests and those found in environmental systems, and hence the significance of results from standardised tests on construction materials is often questioned, since these tests are not designed to mimic environmental conditions.
In Germany, industrial by-products are used as armour stones in hydraulic engineering. In particular, the by-product copper slag has been used over the last 40 years for the construction of embankments, groynes and coastal protection. On the one hand, this material has a high density, and natural resources (the landscape) are protected. On the other hand, the material contains high quantities of metal(loid)s. The copper slag (product name: iron silicate stones) is therefore very suitable as a test material. The metal(loid)s examined were As, Sb and Mo as representatives of (hydr)oxide-forming elements, while Cd, Co, Cu, Fe, Ni, Pb and Zn were studied as representatives of elements forming cations during release.
The questions addressed in this thesis were: (i) can we transfer the results from batch experiments to construction scenarios under the prevalent environmental conditions, (ii) which long-term trends exist for the release of metal(loid)s from copper slags, and (iii) how do environmental conditions influence the leaching of metal(loid)s from hydraulic construction materials?
To answer the first question, the surface-dependent release of the metal(loid)s from the construction materials was examined. To this end, batch leaching experiments with different particle sizes and a constant liquid/solid ratio were performed. In a second step, different methods for determining the specific surface area of armour stones were compared, with a 3D laser scanning method as a reference. In a last step, it was possible to show that, via a roughness factor, the results for the specific surface area of small stones, measured with gas adsorption, can be connected with the results for armour stones, determined with an aluminium foil method. Based on these calculations of the specific surface area, it was possible to significantly improve catchment-scale calculations of the release of metal(loid)s and to evaluate the potential impact of construction materials in hydraulic engineering on the water chemistry of rivers and streams.
To answer the second question, long-term leaching experiments supported by diffusive gradients in thin films were performed for half a year. Diffusive gradients in thin films (DGT) is an in situ method for passively sampling metal(loid)s in water, sediments and soils. The DGT samplers served as a sink for metal(loid)s in the eluate, counteracting the establishment of solution equilibria. Thus the exchange of the eluent, which is normally performed in long-term experiments, was superfluous, and long-term effects could be studied under undisturbed conditions. The long-term leaching experiments with DGT proved capable (i) of differentiating between depletion of the material surface and solution equilibria and (ii) of studying sorption processes with or without a further release of the analytes. For the practically relevant test material copper slag this means that: (i) the cations Cd, Co, Cu, Ni and Pb were confirmed to be released from the slag over the whole period of six months, (ii) a surface depletion of Zn was detected, and (iii) the (hydr)oxide-forming elements As, Mo and Sb were also released from the slag over the whole period of six months, but their release was masked by adsorption to Fe-oxide colloids formed during the leaching experiments. It was confirmed that sulphide minerals are the main source of the long-term release of Cd, Cu, Ni, Pb and Mo.
To answer the third question, short-term leaching experiments simulating environmental conditions in hydraulic engineering were performed. One such factor is salinity. Its influence was tested in batch experiments with sea salt solution (30 g/l), river Rhine water, ultra-pure water and, in addition, with different NaCl concentrations (5, 10, 20 and 30 g/l). In general, the ionic strength is an important factor for metal(loid) release, but the composition of the water (e.g. the HCO3- content) may superimpose this effect. Accordingly, the metal(loid) concentrations in the experiments with ultra-pure water spiked with sea salt or with native river water differed significantly from those with ultra-pure water spiked with NaCl. In a second experiment, the influence of the environmental parameters pH (4–10), sediment content (0 g–3.75 g), temperature (4 °C–36 °C) and ionic strength (0 g/l–30 g/l NaCl), and of the interactions between these parameters, on the release of metal(loid)s from the test material was examined. Statistical Design of Experiments (DoE) was used to study the influence of these factors as well as their interactions. All studied factors may impact the release of metal(loid)s from the test material to the eluent, whereas the release and the partitioning of metal(loid)s between sediment and eluate were impacted by interactions between the studied factors. The main processes were sorption, complexation, solubility, buffering and ion exchange. In addition, after separating the sediment from the slag by magnetic separation at the end of the experiments, the enrichment of metal(loid)s in the sediment was visible. Thus, the sediment, acting via pH, temperature and ionic strength, was the most important factor for the release of the metal(loid)s, because the sediment acted as a sink.
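The two-level full factorial underlying such a Design of Experiments can be sketched as follows. The factor levels are taken from the ranges above; the study's actual design (e.g. center points, replication) may differ:

```python
from itertools import product

# Hypothetical sketch of a two-level full factorial DoE over the four
# factors studied (pH, sediment, temperature, ionic strength): every
# combination of low/high levels gives 2^4 = 16 runs, from which main
# effects and factor interactions can be estimated.

factors = {
    "pH": (4, 10),
    "sediment_g": (0, 3.75),
    "temperature_C": (4, 36),
    "NaCl_g_per_l": (0, 30),
}

runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(runs))   # 16 run combinations
print(runs[0])     # the all-low corner of the design
```

The appeal of the factorial layout is exactly the point made in the abstract: interactions such as sediment x pH are estimable, which one-factor-at-a-time experiments cannot provide.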
Statistical eco(-toxico)logy
(2017)
Freshwaters are of immense importance for human well-being.
Nevertheless, they are currently facing unprecedented levels of threat from habitat loss and degradation, overexploitation, invasive species and
pollution.
To prevent risks to aquatic ecosystems, chemical substances, like agricultural pesticides, have to pass environmental risk assessment (ERA) before entering the market.
Concurrently, large-scale environmental monitoring is used for surveillance of biological and chemical conditions in freshwaters.
This thesis examines statistical methods currently used in ERA.
Moreover, it presents a national-scale compilation of chemical monitoring data, an analysis of the drivers and dynamics of chemical pollution in streams and, by combining these data with results from ERA, a large-scale risk assessment.
Additionally, software tools have been developed to integrate different datasets used in ERA.
The thesis starts with a brief introduction to ERA and environmental monitoring and gives an overview of the objectives of the thesis.
Chapter 2 addresses experimental setups and their statistical analyses using simulations.
The results show that current designs exhibit unacceptably low statistical power, that statistical methods chosen to fit the type of data provide higher power and that statistical practices in ERA need to be revised.
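The kind of simulation behind such a power finding can be sketched as follows. All distributions, effect sizes, and replicate numbers below are assumptions, not the thesis's actual setup: simulated control and treatment groups are compared with an exact one-sided permutation test, and power is the fraction of simulated experiments that reach significance:

```python
import random
from itertools import combinations
from statistics import mean

# Hypothetical sketch: Monte Carlo estimate of statistical power for a
# small two-group design (n = 3 replicates, as in many mesocosm studies)
# using an exact one-sided permutation test on the difference of means.

random.seed(42)

def perm_p_value(control, treatment):
    """One-sided exact permutation p-value for mean(control) - mean(treatment)."""
    pooled = control + treatment
    observed = mean(control) - mean(treatment)
    n = len(control)
    count = total = 0
    for idx in combinations(range(len(pooled)), n):
        c = [pooled[i] for i in idx]
        t = [pooled[i] for i in range(len(pooled)) if i not in idx]
        total += 1
        if mean(c) - mean(t) >= observed - 1e-12:
            count += 1
    return count / total

def power(n_sims=200, n=3, effect=0.3, alpha=0.05):
    """Fraction of simulated experiments detecting a 30% reduction."""
    hits = 0
    for _ in range(n_sims):
        control = [random.gauss(100, 20) for _ in range(n)]
        treatment = [random.gauss(100 * (1 - effect), 20) for _ in range(n)]
        if perm_p_value(control, treatment) <= alpha:
            hits += 1
    return hits / n_sims

print(power())
```

Note that with n = 3 per group the smallest attainable one-sided permutation p-value is 1/20 = 0.05, so significance requires the observed allocation to be the single most extreme one, which illustrates why such small designs tend to have low power.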
In chapter 3 we compiled all available pesticide monitoring data from Germany.
In doing so, we focused on small streams, similar to those considered in ERA, and used threshold concentrations derived during ERA for a large-scale assessment of threats to freshwaters from pesticides.
This compilation resulted in the most comprehensive dataset on pesticide exposure currently available for Germany.
Using state-of-the-art statistical techniques that explicitly take limits of quantification into account, we demonstrate that 25% of small streams are at threat from pesticides.
In particular, neonicotinoid pesticides are responsible for these threats. They are associated with agricultural intensity and can be detected even at low levels of agricultural use.
Moreover, our results indicate that current monitoring underestimates pesticide risks, because sampling is decoupled from precipitation events.
Additionally, we provide a first large-scale study of annual pesticide exposure dynamics.
Chapters 4 and 5 describe software solutions to simplify and accelerate the integration of data from ERA, environmental monitoring and ecotoxicology that is indispensable for the development of landscape-level risk assessment.
Overall, this thesis contributes to the emerging discipline of statistical ecotoxicology and shows that pesticides pose a large-scale threat to small streams.
Environmental monitoring can provide a post-authorisation feedback to ERA.
However, to protect freshwater ecosystems, ERA and environmental monitoring need to be further refined, and we provide software solutions that utilise existing data for this purpose.
Agricultural land use may lead to brief pulse exposures of pesticides in edge-of-field streams, potentially resulting in adverse effects on aquatic macrophytes, invertebrates and ecosystem functions. Higher-tier risk assessment is mainly based on pond mesocosms, which are not designed to mimic stream-typical conditions. Relatively little is known about exposure and effect assessment using stream mesocosms.
Thus, the present thesis evaluates the applicability of stream mesocosms for mimicking stream-typical pulse exposures, for assessing the resulting effects on flora and fauna, and for evaluating aquatic-terrestrial food web coupling. The first objective was to mimic stream-typical pulse exposure scenarios of different durations (≤ 1 to ≥ 24 hours). These exposure scenarios, established using a fluorescence tracer, were the methodological basis for the effect assessment of an herbicide and an insecticide. In order to evaluate the applicability of stream mesocosms for regulatory purposes, the second objective was to assess effects on two aquatic macrophytes following a 24-h pulse exposure to the herbicide iofensulfuron-sodium (1, 3, 10 and 30 µg/L; n = 3). Growth inhibition of up to 66 and 45% was observed for the total shoot length of Myriophyllum spicatum and Elodea canadensis, respectively. Recovery of this endpoint could be demonstrated within 42 days for both macrophytes. The third objective was to assess effects on structural and functional endpoints following a 6-h pulse exposure to the pyrethroid ether etofenprox (0.05, 0.5 and 5 µg/L; n = 4). The most sensitive structural (abundance of Cloeon simile) and functional (feeding rates of Asellus aquaticus) endpoints revealed significant effects at 0.05 µg/L etofenprox. This concentration is below field-measured etofenprox concentrations, which suggests that pulse exposures adversely affect invertebrate populations and ecosystem functions in streams. Such pollution of streams may also result in decreased emergence of aquatic insects and potentially lead to an insect-mediated transfer of pollutants to adjacent food webs. Test systems capable of assessing aquatic-terrestrial effects are not yet integrated in mesocosm approaches but might be of interest for substances with bioaccumulation potential. Here, the fourth part provides an aquatic-terrestrial model ecosystem capable of assessing cross-ecosystem effects.
Information on the riparian food web, such as the contribution of aquatic (up to 71%) and terrestrial (up to 29%) insect prey to the diet of the riparian spider Tetragnatha extensa, was assessed via stable isotope ratios (δ13C and δ15N). The present thesis thus provides the methodological basis for assessing aquatic-terrestrial pollutant transfer and effects on the riparian food web.
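A standard two-source stable isotope mixing model of the kind used for such diet estimates can be sketched as follows. This is not necessarily the model used here; all delta values and the trophic fractionation correction below are made up:

```python
# Hypothetical sketch of a linear two-source mixing model: the fraction f
# of aquatic prey in the consumer's diet satisfies
#   f * d_aquatic + (1 - f) * d_terrestrial = d_consumer - fractionation,
# solved for f using the delta-13C signatures of consumer and prey sources.

def aquatic_fraction(d_consumer, d_aquatic, d_terrestrial, fractionation=0.4):
    corrected = d_consumer - fractionation  # remove assumed trophic shift
    return (corrected - d_terrestrial) / (d_aquatic - d_terrestrial)

f = aquatic_fraction(d_consumer=-29.0, d_aquatic=-32.0, d_terrestrial=-25.0)
print(round(f, 2))  # estimated fraction of aquatic prey in the spider's diet
```

With more than two sources or two isotopes (as with the paired δ13C and δ15N data mentioned above), the system becomes over- or under-determined and is typically solved with Bayesian mixing models instead.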
Overall, the results of this thesis indicate that stream mesocosms can be used to mimic stream-typical pulse exposures of pesticides, to assess the resulting effects on macrophytes and invertebrates within prospective environmental risk assessment (ERA), and to evaluate changes in riparian food webs.