Think!First Field Test - A Technical Retrospective
Numerous technologies were used during the Think!First field test. Looking back, this article summarizes the most important facts and figures.

During the Think!First field test, two clusters were operated continuously in different data centers for about 170 days. For the integration into the Grüne Erde web store, all page requests from live operation had to be processed. Depending on test participation, it was then decided which additional content would be displayed in the web shop.
Numerous open source technologies were used in the development and operation of the distributed systems. The dominant programming languages were Scala, Ruby, TypeScript and JavaScript.
Application Life-cycle Management
For Application Life-cycle Management (ALM), Attribui's DevOps automations and HashiCorp's open source cluster technologies were used. The latest service versions were published directly from version control in Atlassian Bitbucket via Terraform into a Nomad/Consul/Fabio cluster on DigitalOcean. The published services were accessed through a load balancer.
In the web store, dynamic and asynchronous features were integrated into existing websites via JavaScript. This was done in compliance with all data protection and ethical requirements.


The Setup
Application Cluster
The application cluster was responsible for making all (micro-)services reliably available. CoreOS served as the operating system. Clients were published in native JS/SCSS and in Angular, as well as APIs described by the OpenAPI specification (2.0 and 3.0). In addition, test and acceptance systems and tools for working with RESTful APIs and OpenAPI documentation were made available to the project team.
In the course of the field test, 10 system updates were carried out automatically.

OpenAPI / Swagger
All features, changes and fixes were delivered through a standardized ALM process. Server-side communication took place via RESTful APIs and was documented with the OpenAPI specification.
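The article does not name the server-side framework. As a minimal sketch of an OpenAPI-documented REST endpoint of this kind, assuming Akka HTTP and an illustrative route and payload (neither is taken from the field test), a service could look like this:

import akka.actor.ActorSystem
import akka.http.scaladsl.Http
import akka.http.scaladsl.model.{ContentTypes, HttpEntity}
import akka.http.scaladsl.server.Directives._

object LevelApi extends App {
  implicit val system: ActorSystem = ActorSystem("level-api")

  // GET /levels/{participantId} returns the aggregated level as JSON.
  // Path and payload are illustrative placeholders.
  val route =
    pathPrefix("levels" / Segment) { participantId =>
      get {
        complete(HttpEntity(
          ContentTypes.`application/json`,
          s"""{"participantId":"$participantId","level":1}"""
        ))
      }
    }

  Http().newServerAt("0.0.0.0", 8080).bind(route)
}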

Database Cluster
For data management, a cluster of the document-based MongoDB database was set up on Amazon AWS. The nodes ran in a Virtual Private Cloud (VPC) on European infrastructure. All data was stored encrypted, and regular backups were created automatically.
For the database cluster, the IPs of the development, test and live systems were whitelisted; the cluster was not accessible from any other IP.
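Connections from the whitelisted systems were made over an encrypted channel. A minimal sketch of such a client, assuming the official mongo-scala-driver and placeholder host, database and collection names (the real topology is not published), could look like this:

import org.mongodb.scala.{Document, MongoClient, MongoCollection, MongoDatabase}

object EventStore {
  // TLS-enabled connection string; hosts and replica set name are placeholders.
  private val client: MongoClient = MongoClient(
    "mongodb://node-1.example:27017,node-2.example:27017,node-3.example:27017" +
      "/?tls=true&replicaSet=rs0"
  )

  val database: MongoDatabase = client.getDatabase("thinkfirst")
  val events: MongoCollection[Document] = database.getCollection("events")
}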

Web Security
The permitted communication targets were reduced to a minimum, and it was ensured that only approved and tested scripts were used.
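The article does not say how these restrictions were enforced; one common mechanism is a Content-Security-Policy header that whitelists script sources and communication targets. A hedged sketch, again assuming Akka HTTP and placeholder domains (the actual policy of the web store is not public):

import akka.http.scaladsl.model.headers.RawHeader
import akka.http.scaladsl.server.Directives._
import akka.http.scaladsl.server.Route

object WebSecurity {
  // Restrict scripts and XHR/fetch targets to an explicit allow list.
  // The domains below are illustrative only.
  val withSecurityPolicy = respondWithHeader(RawHeader(
    "Content-Security-Policy",
    "default-src 'self'; " +
      "script-src 'self' https://scripts.example.org; " +
      "connect-src 'self' https://api.example.org"
  ))

  val securedRoute: Route = withSecurityPolicy {
    getFromDirectory("public") // example content wrapped by the policy
  }
}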
Monitoring
Extensive monitoring made it possible to keep track of availability, accesses, access times and anticipated system changes.
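The monitoring stack itself is not named in the article. A typical building block for availability checks is a health endpoint that an external monitor polls periodically; a minimal sketch, assuming Akka HTTP and an illustrative path:

import akka.http.scaladsl.model.StatusCodes
import akka.http.scaladsl.server.Directives._
import akka.http.scaladsl.server.Route

object Health {
  // Liveness endpoint for an external monitor; the path and response body
  // are illustrative, the real checks of the field test are not documented.
  val route: Route =
    path("health") {
      get {
        complete(StatusCodes.OK -> "ok")
      }
    }
}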

The Field Test
Incoming HTTP requests were delegated to individual services several times per second. Depending on the season, between 20 and 80 data connections to the database cluster were permanently open. All interactions relevant to the field test were recorded using event sourcing, and the levels of individual returning participants were aggregated dynamically from these events. Actor systems in Scala and Reactive Streams were used for this purpose. In total, around 20,000 relevant events were recorded. The average session duration was 9 minutes, and on average 66 events and actions were recorded per participant.
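The article names actor systems in Scala and Reactive Streams but no specific library. A minimal sketch of such a dynamic aggregation, assuming Akka Streams (a Reactive Streams implementation) and an illustrative event model, since the real event schema is not published, could look like this:

import akka.actor.ActorSystem
import akka.stream.scaladsl.Source

// Illustrative event model; the real field-test schema is not published.
final case class InteractionEvent(participantId: String, action: String)

object LevelAggregation extends App {
  implicit val system: ActorSystem = ActorSystem("thinkfirst-aggregation")
  import system.dispatcher

  // In the field test, events arrived from live HTTP traffic; a small
  // in-memory source stands in for that stream here.
  val events = Source(List(
    InteractionEvent("p-1", "page_view"),
    InteractionEvent("p-1", "quiz_answered"),
    InteractionEvent("p-2", "page_view")
  ))

  // Fold the event stream into a per-participant event count, as a simple
  // stand-in for the dynamic level aggregation described above.
  events
    .runFold(Map.empty[String, Int]) { (levels, event) =>
      levels.updated(event.participantId, levels.getOrElse(event.participantId, 0) + 1)
    }
    .foreach { levels =>
      println(levels) // e.g. Map(p-1 -> 2, p-2 -> 1)
      system.terminate()
    }
}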