Proposals

Eclipse SWTImageJ

Wednesday, March 20, 2024 - 09:31 by Marcel Austenfeld

Introduction:

ImageJ is a versatile platform for image analysis, offering a broad range of functionality in the scientific domain through its scripting capabilities and extensive plugin ecosystem. However, its reliance on AWT for the graphical user interface imposes limitations on appearance, platform compatibility, and user experience. Porting ImageJ to SWT, an efficient, modern Java GUI toolkit, can address these limitations and unlock new opportunities for plugin developers and users of scientific imaging applications.

Objective:

The primary objective of this proposal is to port ImageJ to SWT to enhance its performance, usability, and integration within the Eclipse IDE (a plugin using SWT/AWT already exists).

By leveraging the native capabilities of SWT, we aim to provide users with a more responsive and intuitive interface while ensuring compatibility with Eclipse's development environment.

The original file structure of the software (the organization of class files, source code in packages, plugins, and macros) should also be preserved, so that existing ImageJ developers feel comfortable with, and motivated to improve and use, the ported application.

Proposal:

1. SWT Integration: Polish the existing rewrite of ImageJ's graphical user interface in SWT, leveraging its native components and rendering capabilities. This will ensure better performance and a more consistent user experience across different operating systems.

2. Native Look and Feel: Utilize SWT's support for native widgets to ensure that ImageJ maintains the look and feel of the host operating system, enhancing its integration with the overall desktop environment.

3. Improved Performance: Take advantage of SWT's architecture and optimized event handling to improve the future performance of ImageJ, especially when dealing with large datasets and complex image-processing tasks.

4. Eclipse Integration: Ensure seamless integration of the SWT-based ImageJ with the Eclipse IDE, allowing users to leverage Eclipse's features for project management, version control, and collaborative development.

5. Compatibility and Accessibility: Ensure backward compatibility with existing ImageJ macros, plugins, and functionality where possible, while providing accessibility features to accommodate users with diverse needs and preferences.

6. User Experience Enhancements: Implement usability improvements and user interface enhancements to streamline common workflows and make ImageJ more intuitive and user-friendly.

7. Community Engagement: Engage with the ImageJ community to gather feedback, address concerns, and ensure that the ported version meets the needs and expectations of users and developers.

Expected Outcomes:

1. Enhanced performance and responsiveness of the ImageJ GUI through SWT integration.

2. Improved usability and user experience, leading to increased adoption and satisfaction among users.

3. Seamless integration with Eclipse, facilitating a more productive development workflow for researchers and practitioners.

4. Continued compatibility, where possible, with existing macros, plugins, and functionality, ensuring a smooth transition for current ImageJ users.

Conclusion:

Porting ImageJ to SWT offers significant benefits in terms of performance, usability, and integration within the Eclipse IDE. By undertaking this porting effort, we aim to modernize ImageJ's graphical interface, improve its performance, and enhance its usability, ultimately empowering users to achieve more in their image analysis tasks.

Budget:

Lablicate GmbH has already covered the budget for the development effort required to port ImageJ to SWT, including software development, testing, documentation, and community engagement activities. Additional resources may be allocated for ongoing maintenance and support to ensure the long-term sustainability of the ported version.


References:

Rasband, W.S. ImageJ. U.S. National Institutes of Health, Bethesda, Maryland, USA. https://imagej.net/ij/, 1997-2018.

Schneider, C.A., Rasband, W.S., Eliceiri, K.W. "NIH Image to ImageJ: 25 years of image analysis". Nature Methods 9, 671-675, 2012.

Abramoff, M.D., Magalhaes, P.J., Ram, S.J. "Image Processing with ImageJ". Biophotonics International 11(7), 36-42, 2004.


Eclipse Dataspace Decentralized Claims Protocol

Thursday, March 7, 2024 - 16:00 by James Marino

Technical Details

DCP defines the following protocol flows.

1. Base Identity Protocol (BIP)

The *Base Identity Protocol* specifies how to obtain and communicate participant identities and claims using self-issued security tokens. BIP defines:

- A format for self-issued tokens based on the Decentralized Identifiers (DIDs) v1.0 (https://www.w3.org/TR/did-core), did:web Method (https://w3c-ccg.github.io/did-method-web/), and Self-Issued OpenID Provider v2 (https://openid.net/specs/openid-connect-self-issued-v2-1_0.html) specifications.

- Endpoints and a flow to obtain self-issued security tokens.
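
For illustration, a minimal sketch of what producing such a self-issued token could look like, assuming the Python libraries pyjwt and cryptography; the DIDs and claim names below are illustrative, not normative DCP fields.

```python
# Illustrative only: a self-issued token where issuer and subject are the
# same DID. Assumes pyjwt (with its crypto extra) and cryptography.
import datetime

import jwt  # pyjwt
from cryptography.hazmat.primitives.asymmetric import ec

# In a deployment, the matching public key would be resolvable through the
# participant's DID document (e.g. published via did:web).
private_key = ec.generate_private_key(ec.SECP256R1())

claims = {
    "iss": "did:web:provider.example.com",  # self-issued: issuer ...
    "sub": "did:web:provider.example.com",  # ... and subject are the same DID
    "aud": "did:web:consumer.example.com",
    "exp": datetime.datetime.now(datetime.timezone.utc)
    + datetime.timedelta(minutes=5),
}
token = jwt.encode(claims, private_key, algorithm="ES256")
print(token)
```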

2. Verifiable Presentation Protocol (VPP)

The *Verifiable Presentation Protocol* (VPP) defines how Verifiable Credentials (VCs) and other identity-related resources are stored and presented. The VPP covers the following aspects:

- Endpoints and message types for storing identity resources belonging to a holder
- Endpoints and message types for resolving identity resources
- Secure token exchange for restricting access to identity resource endpoints

The VPP makes use of the following standards (among others):

- Verifiable Credentials Data Model v1.1 (https://www.w3.org/TR/vc-data-model/)
- DIF Presentation Exchange (https://identity.foundation/presentation-exchange/spec/v2.0.0)
- JSON Web Token (JWT) (https://www.rfc-editor.org/info/rfc7519)

The VPP is designed to make integrating existing software wallets and identity systems easy. 
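
As an illustration of the presentation side, a minimal DIF Presentation Exchange definition, which a verifier could use to request a specific credential type, might look as follows; the ids and the credential type are illustrative.

```python
# A minimal presentation definition following the shape of DIF Presentation
# Exchange v2 (illustrative ids and credential type).
presentation_definition = {
    "id": "membership-check",
    "input_descriptors": [
        {
            "id": "membership-credential",
            "constraints": {
                "fields": [
                    {
                        # Match credentials whose type array contains the
                        # requested (illustrative) credential type.
                        "path": ["$.type"],
                        "filter": {
                            "type": "array",
                            "contains": {"const": "MembershipCredential"},
                        },
                    }
                ]
            },
        }
    ],
}
```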

3. Credential Issuance Protocol (CIP)

Verifiable Credentials enable a holder to present claims directly to a Relying Party (RP) without the involvement or knowledge of the `Credential Issuer`. The *Credential Issuance Protocol* (CIP) provides an interoperable mechanism for parties (potential holders) to request credentials from a `Credential Issuer`. Specifically:

- Formats and profiles for verifiable credentials based on 
- The protocol defines the endpoints and message types for requesting credentials to be issued by a `Credential Issuer`.
- The protocol is designed to handle both use cases where credentials can be issued automatically and use cases where a manual workflow is required.

Use of Relevant Standards

DCP is based on relevant existing standards where possible. See the above description for specific examples. 

Integration with Identity-related projects

DCP is designed to integrate with existing wallet and identity systems. For example, Catena-X has integrated [Keycloak](https://www.keycloak.org/) with the Verifiable Presentation Protocol as a token issuer. Existing wallets and credential issuance systems can support the VPP and CIP endpoints.

Open Source Implementations and Industry Adoption

Two open-source Eclipse projects are currently implementing the specifications: the EDC Identity Hub (https://github.com/eclipse-edc/IdentityHub) and the Tractus-X Managed Identity Wallet (https://github.com/eclipse-tractusx/managed-identity-wallet).

A TCK (Technology Compatibility Kit) is currently planned and will be based on the [Dataspace TCK Framework](https://github.com/eclipse-dataspacetck).

Dataspaces that use DCP

Eona-X | EONA-X is a dataspace in the domain of Mobility, Transport and Tourism. It leverages EDC capabilities to power data exchanges between its participants.

Contact: phebant[@]amadeus[.]com

Catena-X | Catena-X offers the first open and collaborative data space for the automotive industry to boost business processes using data-driven value chains.

Contact: info[@]catena-x[.]net

Eclipse Zenoh-Flow

Tuesday, February 20, 2024 - 08:50 by Julien Loudet

Eclipse Zenoh-Flow aims to simplify and structure (i) the declaration, (ii) the deployment, and (iii) the writing of complex and potentially safety-critical applications that can span from the cloud to the microcontroller.

To these ends, Eclipse Zenoh-Flow combines the data flow programming model, in which applications are viewed as a directed graph of computing units, with Eclipse Zenoh, an edge-native, data-centric, location-transparent communication middleware. This makes for a powerful combination: Zenoh offers flexibility and extensibility, while data flow programming structures computations. The main benefit of this approach is that it separates applications from the underlying infrastructure: data are published and subscribed to (automatically, with Zenoh-Flow) without any need to know where they are actually located.
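
As a minimal illustration of the data flow model (deliberately framework-free; this is not the Zenoh-Flow API), an application can be viewed as a directed graph of computing units through which data is pushed:

```python
# A toy data flow graph: a source, an operator, and a sink, connected by
# directed edges. Node names and functions are illustrative.
from collections import defaultdict

nodes = {
    "camera": lambda _: "raw-frame",              # source: produces data
    "detect": lambda frame: f"objects({frame})",  # operator: transforms data
    "log": lambda objs: print(objs),              # sink: consumes data
}
edges = [("camera", "detect"), ("detect", "log")]

downstream = defaultdict(list)
for src, dst in edges:
    downstream[src].append(dst)

def fire(node, value=None):
    """Run one node and push its output to all downstream nodes."""
    out = nodes[node](value)
    for nxt in downstream[node]:
        fire(nxt, out)

fire("camera")  # prints: objects(raw-frame)
```

In Zenoh-Flow, the wiring shown here as in-process function calls is instead carried over Zenoh, so the units can run anywhere from the cloud to a microcontroller without changing the application logic.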

Eclipse Zenoh-Flow further separates the business logic contained within each computing unit from the constraints of the overall application. An application is described in a descriptor file that groups together the different blocks that compose it, as well as (i) how these blocks are connected, (ii) how they should be deployed on the infrastructure, and (iii) possible execution constraints (e.g., time requirements or execution order).

As Eclipse Zenoh-Flow is also targeting safety-critical applications, it leverages the Rust programming language for its core. It also offers bindings in other languages to facilitate integration with existing software.

Eclipse Dataspace Protocol

Friday, February 16, 2024 - 03:05 by Sebastian Steinbuss

The Eclipse Dataspace Protocol is used in the context of data spaces, as described and defined in the subsequent sections, with the purpose of supporting interoperability. The specification provides fundamental technical interoperability for participants in data spaces; implementing the protocol specified here is therefore a prerequisite for joining such a data space. Beyond the technical interoperability measures described in this specification, semantic interoperability should also be addressed by the participants. From the perspective of the data space, interoperability also needs to be addressed at the levels of trust, organization, and law. Cross-data-space communication is not the subject of this document, as it is addressed by the data spaces' organizational and legal agreements.

The interaction of participants in a data space is conducted by the participant agents, so-called Connectors, which implement the protocols described above. While most interactions take place between Connectors, some interactions with other systems are required. The figure below provides an overview of the context of this specification.

An Identity Provider realizes the required interfaces and provides the information required to implement the Trust Framework of a data space. The validation of the identity of a given participant agent and the validation of additional claims are the fundamental mechanisms. The structure and content of such claims and identities may vary between data spaces, as may the structure of the Identity Provider itself, e.g., a centralized, a decentralized, or a federated system.

A Connector will implement additional internal functionality, such as monitoring or policy engines, as appropriate. Whether and how a Connector implements such functionality is not covered by this specification.

The same applies to the data transferred between the systems. While this document defines neither the transport protocol nor the structure, syntax, and semantics of the data, a specification of those aspects is required and is subject to the agreements of the participants or the data space.

Eclipse Conformity Assessment Policy and Credential Profile

Monday, February 12, 2024 - 09:16 by Pierre Gronlier

This proposal contributes to implementing an end-to-end, holistic approach to seamlessly integrate, anchor, and enforce negotiated data exchange agreements throughout the underlying infrastructure, encompassing processes related to data processing, storage, and transfer.

Conformity assessment is one of the industry's answers to risk management, yet there is no commonly accepted specification for interoperable conformity assessment.

The goal of this specification is to combine the existing world of conformity assessment with the need for a Provider and a Consumer to reach an agreement based on a common understanding, using existing claims and evidence.

Examples:

  • How can a Provider express that an unknown Consumer shall only process data on ISO 27001 certified services?
  • How can the Consumer demonstrate that it fulfils such a requirement?


To demonstrate the viability of the current proposal, Gaia-X used this specification to build the Gaia-X Compliance service; an open-source implementation of the elements described in the Scope section is available at https://gitlab.com/gaia-x/lab/compliance.

The Gaia-X Compliance service and its implementation based on the current specification proposal are being used by several data spaces, including Catena-X, Eona-X, Prometheus-X, and Agdatahub.

The implementation used to validate the current specification is based on:

  • W3C Verifiable Credentials, W3C SHACL, and W3C SPARQL for the validation and verification of the models mentioned in the Scope section
  • W3C DID, X.509, and JOSE for the cryptographic chains of trust
  • ETSI TS 119 312 and EBSI APIs for the presentation of the trust anchors
  • OIDC4VCI and OIDC4VP for the exchange of W3C Verifiable Credentials
  • TRAIN to discover the ecosystem rules and trust anchors
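
For example, a minimal sketch of the SHACL-based validation mentioned in the first item above, assuming the Python libraries rdflib and pyshacl; the shape and data below are illustrative only.

```python
# Illustrative only: validate that a service description carries at least
# one certification claim, using pyshacl.
from rdflib import Graph
from pyshacl import validate

data_graph = Graph().parse(format="turtle", data="""
@prefix ex: <http://example.org/> .
ex:service1 a ex:Service ; ex:certifiedAgainst "ISO 27001" .
""")

shapes_graph = Graph().parse(format="turtle", data="""
@prefix sh: <http://www.w3.org/ns/shacl#> .
@prefix ex: <http://example.org/> .
ex:ServiceShape a sh:NodeShape ;
    sh:targetClass ex:Service ;
    sh:property [ sh:path ex:certifiedAgainst ; sh:minCount 1 ] .
""")

conforms, _, report_text = validate(data_graph, shacl_graph=shapes_graph)
print(conforms)     # True for the data above
print(report_text)  # human-readable validation report
```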

Eclipse CANought

Friday, January 12, 2024 - 16:23 by Martin Brown

Eclipse CANought offers automotive-specific capabilities that enhance the Eclipse Kanto project into a comprehensive solution for the automotive industry. It provides extensions to Eclipse Kanto that address the automotive market segment by standardizing and securing CAN communications.

It is critical that Automotive IoT applications have a standardized and secure solution for CAN communications with embedded controllers. The Eclipse CANought extensions will make Eclipse Kanto more appropriate for automotive-focused implementations.

Eclipse Corinthian

Thursday, November 30, 2023 - 04:39 by Andrew Katz

The Eclipse Cardinal Program has identified areas where template and process documents would add value to legal processes involved in technology, specifically in the areas of procurement and M&A activities involving open source software.

The documents will be hosted in a git-based repository enabling open source development techniques and methodologies to be used. Anyone can raise issues with the documents, fork the documents (for example to create jurisdiction-specific versions of them), and issue pull/merge requests.

Release versions of the documents will be made available in markdown, docx and plain text formats, through a simple and clear web frontend, meaning that users who don’t wish to engage with the development process can easily access them directly. The liberal licensing model will allow unrestricted use, modification and re-distribution.

Overview

The initial set of documents was developed by Moorcrofts LLP, a law firm based near London, England, in association with Orcro Limited, an open source consultancy also based in the UK. Moorcrofts and Orcro are both OpenChain partner organisations (openchainproject.org) and have been working in that capacity to develop:

1. a due diligence questionnaire and set of warranties for acquiring software from a developer using open source software; and

2. a due diligence questionnaire and set of warranties for use in M&A transactions involving a target which develops software using open source.

These will be hosted by the Eclipse Cardinal Program from day one. We also have a suite of documents drafted to facilitate the supply of services over the internet using microtransaction architectures, which have also been developed by specialist law firms in a number of jurisdictions worldwide. The roadmap includes the development of software intended to facilitate the drafting, assembly, storage and analysis of legal documents. For example, since drafting contracts shares many characteristics with writing software, we propose developing a module for the Eclipse IDE which facilitates this.

Due diligence and warranties for open source development: procurement and M&A

The open source procurement and M&A process has historically focussed on specific releases of supplied software (for example, by analysing the composition of that release, and reviewing the licences for each component within the release). This is becoming less and less effective as a means of analysing and determining compliance risk as software development moves to a CI/CD model (continuous integration/continuous deployment/development).

A much more effective approach is to focus the warranties on the development process itself, and on the processes, policies, and procedures which the developing organisation uses to manage that development process. An ISO standard, ISO/IEC 5230:2020 (OpenChain), defines the characteristics that a development program must have in order to manage open source compliance risk effectively, and the standard lends itself to a framework for both due diligence and warranties, in both procurement and M&A. The beauty of this approach is that it does not require that the target is compliant with, or even aware of, ISO/IEC 5230:2020 (but it does mean that applying the process to a compliant organisation is that much more straightforward).

The initial set of due diligence questions for procurement has been developed using the ISO/IEC 5230 framework, with input from many active members of the OpenChain project, and the procurement terms have themselves been adapted to form the M&A due diligence and warranty suite.

Eclipse FA³ST

Monday, November 27, 2023 - 03:55 by Michael Jacoby

Eclipse FA³ST aims to provide an implementation of the reactive (or type 2) Asset Administration Shell according to the Asset Administration Shell (AAS) specification by Plattform Industrie 4.0. In contrast to other implementations such as Eclipse BaSyx, Eclipse FA³ST will focus on deployment at the edge level rather than the cloud level, meaning that its primary focus will be on asset connectivity rather than scalability. Other expected key features are:

  • Easy to set up and use
  • Open architecture that enables easy customization and extension
  • Implementation of the API for HTTP and OPC UA (a sketch of an HTTP call follows below)
  • Protocol-agnostic synchronization with assets
  • Usable via the command-line interface, as a Docker container, and as an embedded library
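
As a sketch of what the HTTP API allows, a client could list the Asset Administration Shells hosted by a running FA³ST service; the base URL, port, and path prefix below are assumptions that depend on the service configuration.

```python
# Illustrative only: query a locally running FA³ST service over HTTP.
# The endpoint follows the AAS repository interface (GET .../shells);
# host, port, and path prefix are assumed and configuration-dependent.
import requests

response = requests.get("http://localhost:8080/api/v3.0/shells")
response.raise_for_status()
for shell in response.json().get("result", []):
    print(shell.get("id"))
```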

Eclipse WattAdvisor

Tuesday, November 21, 2023 - 11:01 by Jonas Pemsel

Eclipse WattAdvisor provides a Python library that consists of several functions and classes to build and solve a linear optimization model based on certain input data and parameters. The optimization model represents a local energy system composed of different energy components to produce, store, or transform energy.

Input data can contain information about the following aspects of the local energy system:

  • Geographical location
  • Annual energy demands per energy type
  • Installed power and/or capacity per energy component already existing in the system
  • Maximum potential power per energy component that can be added to the system
  • Custom cost parameters per energy component
  • Prices of tariffs to purchase energy from external sources for the energy system

With this input data, a predefined function can be called to start the automatic processing of the model. First, it builds a generic optimization model formulated with the Python library Pyomo and parameterizes it according to the input data, creating a specific optimization problem. The objective function to minimize equals the sum of the total costs of all components used in the composition. The total cost per component consists of at most:

  • Investment costs converted into annual costs by an annuity factor based on the expected lifespan of the component
  • Annual operational cost
  • Annual energy purchase cost

Investment and operational costs are calculated by applying the specific cost factors of a component, while annual energy purchase costs are determined from the expected amount of energy purchased by the component and the energy tariff prices given as input. Note that the total cost of each component is formed by an individual combination of the three possible cost elements; e.g., the total cost of an energy purchase component consists only of the annual energy purchase cost.

The optimization problem is then passed to an open-source solver. The resulting solution follows the fundamental objective of finding the cost-minimal composition of energy components and remaining energy purchases that supplies the parameterized demands.
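
To make this concrete, here is a minimal sketch of the kind of model described above, assuming Pyomo and the open-source CBC solver are installed; the components, cost figures, and yields are illustrative and do not reflect WattAdvisor's actual API.

```python
# Illustrative only: choose a photovoltaic size and grid purchases that
# cover an annual electricity demand at minimal total annual cost.
import pyomo.environ as pyo

demand_kwh = 10_000          # annual electrical energy demand
pv_invest_per_kw = 800.0     # investment cost per kW installed (illustrative)
pv_yield_kwh_per_kw = 950.0  # annual energy yield per kW (illustrative)
grid_price = 0.30            # energy purchase tariff per kWh (illustrative)
lifespan, interest = 20, 0.05

# Annuity factor converting the one-time investment into annual costs.
annuity = interest * (1 + interest) ** lifespan / ((1 + interest) ** lifespan - 1)

m = pyo.ConcreteModel()
m.pv_kw = pyo.Var(domain=pyo.NonNegativeReals, bounds=(0, 8))  # max potential
m.purchase_kwh = pyo.Var(domain=pyo.NonNegativeReals)

# Energy balance: production plus purchase must cover the annual demand.
m.balance = pyo.Constraint(
    expr=m.pv_kw * pv_yield_kwh_per_kw + m.purchase_kwh >= demand_kwh)

# Objective: annualized investment cost plus annual energy purchase cost.
m.total_cost = pyo.Objective(
    expr=m.pv_kw * pv_invest_per_kw * annuity + m.purchase_kwh * grid_price,
    sense=pyo.minimize)

pyo.SolverFactory("cbc").solve(m)
print(pyo.value(m.pv_kw), pyo.value(m.purchase_kwh))
```

With these illustrative numbers, PV energy costs about 800 · 0.08 / 950 ≈ 0.07 per kWh, well below the purchase tariff, so the solver installs the full 8 kW potential and buys the remaining energy from the grid.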

After a valid solution is found, the relevant result data is exported either in machine-readable (JSON) or human-readable (Excel) format.

Energy balances are formulated in the model for these energy carriers:

  • Electrical energy
  • Thermal energy
  • Natural gas

Currently, classes to implement the following component groups are predefined and available in the project:

  • Combined heat and power plant
    • Transforming natural gas into electrical energy
  • Electrical energy storage
    • Storing electrical energy
  • Energy demand
    • Consuming energy from the system
    • Can be parameterized for all energy carriers
  • Energy feed-in
    • Consuming energy from the system and generating income
    • Can be parameterized only for electrical energy
  • Energy purchase
    • Producing energy for the system
    • Can be parameterized for all energy carriers
  • Gas boiler
    • Transforming natural gas into thermal energy
  • Heat pump
    • Transforming electrical energy into thermal energy
    • Efficiency estimation by the usage of historical air or ground temperature weather data
  • Photovoltaic plant
    • Roof surface
    • Free field
    • Producing electrical energy for the system
    • Energy production estimation by the usage of historical solar radiation weather data
  • Solar thermal energy plant
    • Producing thermal energy for the system
    • Energy production estimation by the usage of historical solar radiation weather data
  • Thermal energy storage
    • Storing thermal energy
  • Wind power plant
    • Producing electrical energy for the system
    • Energy production estimation by the usage of historical wind speed weather data

By adding new component groups, this list can be extended, which in turn contributes to extending the scope and applicability of the model.