Ecosystem Approach Community of Practice: Validation

Revision as of 15:11, 7 December 2012

Goal and Overview

According to the Description of Work, the conception and validation of concrete applications serving the business cases is part of the WP3 activity. The goal of the validation activity is to

  • document status and issues on harmonization approaches, and
  • validate implemented policies and business cases, their functionality and conformance to requirements.

The approach taken in iMarine builds on the experience gained in the validation efforts of the preceding D4Science-II project, where the FURPS model was adopted to generate templates against which to 'benchmark' the performance of a component.

FURPS is an acronym representing a model for classifying software quality attributes (functional and non-functional requirements):

  • Functionality - Feature set, Capabilities, Generality, Security
  • Usability - Human factors, Aesthetics, Consistency, Documentation
  • Reliability - Frequency/severity of failure, Recoverability, Predictability, Accuracy, Mean time to failure
  • Performance - Speed, Efficiency, Throughput, Response time
  • Supportability - Testability, Extensibility, Adaptability, Maintainability, Compatibility, Serviceability, Localizability

In iMarine, where the commitments of the EA-CoP (Community of Practice) reach further than in the previous project, validation is performed on requirements and specifications produced by the EA-CoP, where these can be aligned with iMarine Board Work Plan objectives.

This implies that the validation is not equivalent to testing individual component behavior and performance in the e-Infrastructure, but assesses progress towards achieving a Board objective. This gives the validation an important role in one of the Board's main activities: priority setting to improve the sustainability of the e-Infrastructure.

Validation thus focuses on project outputs, and does not intend to influence the underlying software architecture, development models, software paradigms or code base.

A validation is triggered by the release of a software component, part thereof, or data source that is accessible by project partners through a GUI in the e-Infrastructure.

Procedure(s) and Supporting tools

The validation procedure

Once a VRE (Virtual Research Environment) is released, those Board Members that have expressed their interest in the component (through the Board Work Plan) are alerted by the WP3 Leader, and the validation commences with the compilation of a set of questions based on the above FURPS criteria.

The feedback is collected by the WP3 Leader and discussed with the Technical Director or other project representatives; when necessary, enhancement tickets are produced, with a clear reference to the released component.

If the released VRE is validated for the first time, the Round 1 results are discussed with the validators and the EA-CoP representatives. If needed, the iMarine project is asked to modify or extend the delivered functionality. If these modifications exceed the level of enhancement tickets, a Round 2 validation is foreseen, which can be followed by further rounds until a satisfactory level of completeness has been achieved.

This validation approach is fully in line with the application development methodology, which privileges the evolutionary approach. With this approach, a very robust but not functionally complete application is released to the validators and then constantly refined by adding new features and by assessing the validators' comments and feedback. This evolutionary approach acknowledges that the requirements are neither always completely defined by the community nor completely understood by the technological providers, and therefore builds only those features that are well understood and mature. This allows the technological providers to add features, to accommodate new requests, or to make changes that could not be conceived during the initial requirements phase, by accepting the fact that an innovative VRE must evolve through use in its intended operational environment.

The reporting on the wiki and in other tools

Validation results will also be documented in this wiki, which will contain the name of the component validated, the dates of validation, and the results. Tickets will refer to the component name, the page where the validator comment originated, a descriptive text, and links to relevant documentation on the iMarine Board work plan and the validation wiki page.

In addition, when validation results in an error, specific 'Issue tickets' can be entered in the project TRAC system. If a validator encounters a functionality, usability, or performance issue, specific 'enhancement tickets' can be produced in liaison with the WP3 Leader. Supportability issues are usually discussed in other fora.

Validation Results: Overall Summary

November 2012 Validation Round

In one sentence, the September 2012 status of the validation reveals that the delivered Virtual Research Environments meet expectations.

As expected, the bulk of the work in Period 1 went towards establishing data access and discovery functionality, with less emphasis on user-facing VRE components. Hence, the validation in P1 focused on those VREs that could already be equipped with data access components.

The VREs build on a complete e-infrastructure, which is not subject to EA-CoP validation. The entire validation effort never revealed any issues related to the architecture, design, or operation of the underlying components, which service, e.g., the data streams, storage, user access and management, and security. These functionalities are not validated, but since no issues were reported, they can be considered to have received the silent approval of the EA-CoP. Some comments from the validation are worth mentioning here:

  • The improvements to the work-space are impressive, and allow for much better collaboration on data and results;
  • The messaging system vastly reduces the effort to share large datasets, or simple messages;
  • The integration option with the desktop was appreciated;
  • The gCubeApps concept was considered an important step to attract 'light' users (data consumers);
  • The live access and search of the biodiversity data was considered very useful, and was recommended to be made available at generic level.

iMarine technologies serve a plethora of use-cases, and VREs may be re-used across business cases. The validation efforts recognize that the target often has to be a subset of the expected BC, and that a validation is not an atomic or isolated effort.

The validation efforts were spread out over the entire Period 1, and are continuing to date. As of 20 November 2012, the following VRE functionality was validated successfully:

  • ICIS Data import, curation and storage;
  • ICIS TS manipulation, graphing, and use in R;
  • ICIS TS to GIS, Mapviewer;
  • AquaMaps Map Generation and data manipulation;
  • AquaMaps GIS Viewer and geospatial product facilities;
  • FCPPS and VME-DB Reporting tools;
  • FCPPS and VME-DB Documents work-flow;
  • BiodiversityResearchEnvironment taxonomy data tools;
  • BiodiversityResearchEnvironment Occurrence data tools;
  • VTI Capture aggregator.

Other VREs were only partially validated, and these reports will be added when completed.

Community tools, i.e. tools that are built in an EA-CoP context and later brought to the project as data providers or components for integration, were not at a level to be validated at M11.

Validation Results: Detailed Reports

One page is provided for each "item" / functional area that is the target of the validation activity.

Due to the heterogeneity of the e-infrastructure, and the re-use and orthogonality of many components, validation need not be an exhaustive exercise for each VRE; a component validated in one VRE will not have to be validated when re-used in another. A validation report on e.g. VTI can only be understood if one has practical knowledge of the ICIS VRE.

This is reflected in the structure of the validation report, which can seem skewed in (dis)favor of the earlier components, where validation resulted in effective and structural improvements to the benefit of other VREs. The evolving e-ecosystem around ICIS, FCPPS, and AquaMaps is validated with these 3 key VREs as a starting point.

Validation Log

EA-CoP VRE validation activities

VRE Name    VRE Component          Last activity   Detail report   Status
ICIS        Curation               09.12                           Validated
ICIS        Code List Management   09.12                           Validated
ICIS        SPREAD                 09.12                           Not Validated
ICIS        VTI                    04.12           Detail report   Validated
ICIS        R-Integration          08.12                           Validated
Reporting   FCPPS                  02.12                           Validated
Reporting   FishFinder             10.12                           Not started


2012 Validation