
Helix Co-Founder Marcel Fohrmann in a Tech Interview: "Our Tangle is decentralized"



Helix is a young Berlin-based blockchain start-up working on Tangle technology. The Tangle is usually associated with IOTA. The promise is huge: no scaling problems, no fees. Quite a few, however, criticize the central role of the Coordinator. Helix wants to do better and has found a way to enable decentralized organization in the Tangle as well. So as not to merely scratch the surface, the following interview also goes into technical detail: not only for IT experts, but they in particular will get their money's worth.

When you hear the keyword Tangle, you immediately think of IOTA. What does your Tangle have in common with IOTA's, and what is different?

The Tangle is, in our view, first and foremost a very descriptive term for a seemingly chaotic storage arrangement. In this sense, the Helix Tangle is no different from the IOTA Tangle. Transactions are represented in a Directed Acyclic Graph (DAG) for further processing, which brings benefits in scalability, a key advantage over a blockchain.

In the Tangle, new transactions are linked into the graph by a "Tip Selection" algorithm, which uses transaction weights to simulate a "Random Walk"; this walk in turn yields the two transactions that the new transaction will reference. Previous implementations of the Tangle, however, exhibit an instability in tip selection: in practice, some transactions end up as so-called "orphans" that are never referenced by newly attached transactions. Our data science team has analyzed the existing tip-selection procedures in practice, found a variety of optimization opportunities, and implemented realistic simulations of the solutions.
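A minimal sketch of such a weight-biased tip selection, assuming a simplified cumulative-weight model; this is an illustration of the general random-walk principle, not the optimized procedure the team describes:

```python
import math
import random

def cumulative_weight(tx, approvers):
    """Cumulative weight of `tx`: itself plus every transaction that
    directly or indirectly approves it. `approvers` maps a transaction
    to the set of transactions that directly approve it."""
    seen, stack = set(), [tx]
    while stack:
        for a in approvers.get(stack.pop(), ()):
            if a not in seen:
                seen.add(a)
                stack.append(a)
    return 1 + len(seen)

def random_walk(genesis, approvers, weights, alpha=0.5):
    """Walk from the genesis toward the tips; at each step, prefer the
    branch with the larger cumulative weight (alpha sets the bias)."""
    current = genesis
    while approvers.get(current):
        candidates = list(approvers[current])
        probs = [math.exp(alpha * weights[c]) for c in candidates]
        r = random.uniform(0, sum(probs))
        for candidate, p in zip(candidates, probs):
            r -= p
            if r <= 0:
                current = candidate
                break
        else:
            current = candidates[-1]  # guard against float rounding
    return current  # reached a tip

def select_tips(genesis, approvers):
    """Return two tips for a new transaction to reference."""
    weights = {tx: cumulative_weight(tx, approvers) for tx in approvers}
    return (random_walk(genesis, approvers, weights),
            random_walk(genesis, approvers, weights))
```

With a very high alpha the walk degenerates into a deterministic heaviest-branch descent, which is exactly the instability regime where light branches are orphaned; tuning this bias is one of the optimization levers mentioned above.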

Furthermore, Helix uses a binary instead of a trinary interpretation of the Tangle. We understand the benefits of a trinary interpretation, particularly when executed on dedicated hardware; however, Helix pursues the goal of maximum end-user compatibility. Accordingly, the cryptography was implemented in binary, where we adhere to NIST standards and do without highly experimental procedures. In addition, we currently consider it very daring to make assumptions about resistance against quantum computers, though we are also engaging intensively with this challenge.

Another problem of the Tangle lies in the incomparability of its elements. Timestamps, for example, can only be validated semantically. Unlike with a blockchain, it is difficult to derive an intuitive linear order from the Tangle; consequently, there are elements that must be considered incomparable. We are therefore working in close cooperation with research institutes on a method to order subsets of the Tangle efficiently without losing semantic validity. This is the basic requirement for the planned support of smart contracts, for which the order in which transactions are executed is vitally important. For example, smart contracts used on exchanges must have a way to inspect the order of bids so that the digital asset is transmitted to the first bidder.
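The incomparability can be made concrete: in a DAG, two transactions are ordered only if one transitively references the other, so tips on separate branches are incomparable. A small illustrative sketch, not the method being developed with the research institutes:

```python
def reachable(start, target, refs):
    """True if `start` (transitively) references `target`.
    `refs` maps each transaction to the transactions it approves."""
    stack, seen = [start], set()
    while stack:
        tx = stack.pop()
        if tx == target:
            return True
        for r in refs.get(tx, ()):
            if r not in seen:
                seen.add(r)
                stack.append(r)
    return False

def comparable(a, b, refs):
    """Two transactions are comparable iff one references the other;
    otherwise no order between them can be read off the DAG alone."""
    return reachable(a, b, refs) or reachable(b, a, refs)
```

Two bids sitting on disjoint branches are exactly the `comparable(a, b, refs) == False` case: the graph alone cannot say which came first, which is why an additional ordering mechanism is needed for exchange-style smart contracts.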

How are you planning the decentralization, and what is your incentive for network participants?

The Helix network introduces so-called "Validator Nodes", whose primary task is to attach milestones to the Tangle and to assign a linear order to the "Cutset", i.e. the interval from one milestone up to and including the next milestone. Milestones are transactions bearing a particular digital signature and are ultimately responsible for the confirmation of transactions. The validators collectively sign each milestone to be set by means of the "Schnorr Digital Signing Algorithm", so that every network participant is able to verify the authenticity and origin of the milestone.
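The verification property, that anyone can check a milestone's signature against the validators' public key, can be illustrated with a toy single-signer Schnorr scheme over a tiny prime-order group. Real deployments use large elliptic-curve groups and a collective (multi-party) signing protocol; all parameters here are purely illustrative:

```python
import hashlib

# Toy Schnorr signature over a small prime-order subgroup.
P = 2039          # safe prime, P = 2*Q + 1
Q = 1019          # prime order of the subgroup
G = 4             # generator of the order-Q subgroup (a quadratic residue)

def _hash(r, msg):
    """Fiat-Shamir challenge e = H(r || msg) reduced mod Q."""
    data = r.to_bytes(2, "big") + msg
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % Q

def keygen(secret):
    x = secret % Q
    return x, pow(G, x, P)            # (private key, public key)

def sign(x, msg, nonce):
    k = nonce % Q                     # must be fresh and secret per signature
    r = pow(G, k, P)
    e = _hash(r, msg)
    s = (k + e * x) % Q
    return r, s

def verify(y, msg, sig):
    """Check g^s == r * y^e, which holds iff the signer knew x."""
    r, s = sig
    e = _hash(r, msg)
    return pow(G, s, P) == (r * pow(y, e, P)) % P
```

The collective variant the interview refers to aggregates the validators' individual contributions into one signature of the same shape, so verification for a network participant stays this cheap.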

The validators run a modified "practical Byzantine Fault Tolerance" algorithm (pBFT) in order to reach agreement on milestones and their order. The probability of being chosen as "Primary" or "Leader" is proportional to the respective "Reputation" of a validator. For the election of the Primary, "RandHerd" (Scalable Bias-Resistant Distributed Randomness) is used. The network is completely permissionless, so joining the group of validators requires no permission either: reputation alone decides on the existence or non-existence of a validator. The reputation value of a validator derives from the totality of the proof of work assigned to it by full nodes or light nodes. In the Helix network, each participant inevitably decides on a certain validator, to which they commit with each transaction. Incidentally, this basic concept allows a seamless integration of smart contracts and Proof of Useful Work.
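The reputation-weighted election can be sketched as follows: a shared unbiased random value (of the kind a scheme like RandHerd supplies) is mapped deterministically onto the validator set, so every node computes the same Primary. The weighting below is a hypothetical illustration, not Helix's actual formula:

```python
import hashlib

def elect_primary(validators, reputation, beacon):
    """Pick a validator with probability proportional to its reputation,
    using a shared random beacon so all nodes agree on the result."""
    total = sum(reputation[v] for v in validators)
    # Derive a deterministic point in [0, total) from the beacon value.
    digest = hashlib.sha256(beacon).digest()
    point = int.from_bytes(digest, "big") % total
    acc = 0
    for v in sorted(validators):  # fixed iteration order on every node
        acc += reputation[v]
        if point < acc:
            return v
```

Because the beacon is bias-resistant, a validator cannot grind toward a favorable outcome; its only lever on the election probability is honestly accumulating reputation.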

In addition to altruistic incentives, such as the very low energy consumption, there is also an economic interest. Validators will have the ability to manage so-called "Fabrics". Fabrics are a combination of "Compute Nodes" to which a certain task is assigned, for example protein folding. A client can send its data set for processing to a "Fabric Primary" (e.g. a validator), which in turn forwards it to the task-specific Fabric. The reward is distributed fairly among all compute nodes within the Fabric, while the Fabric Primary receives a commission. In this respect, a validator has a strong incentive to improve its reputation.
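The described payout, equal shares for the compute nodes plus a commission for the Fabric Primary, might look like this; the 10 percent commission rate is an assumption for illustration:

```python
def split_reward(total, compute_nodes, commission_rate=0.10):
    """Split a reward: the Fabric Primary takes a commission and the
    remainder is divided equally among the compute nodes."""
    commission = total * commission_rate
    share = (total - commission) / len(compute_nodes)
    payout = {node: share for node in compute_nodes}
    payout["fabric_primary"] = commission
    return payout
```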

What use cases are you targeting?

Helix and a number of partners are currently working on a variety of use cases, which should soon serve as proofs of concept on the TestNet. More details will be announced in due time.

Our main focus is currently on "Beams", a bio-marketplace that enables the secure transfer of sensitive biological data and is thus a milestone in personalized health care. Furthermore, it allows streaming biologically valuable data, or subscribing to such a stream, in exchange for compensation. It is important to note that the integrity of the data must be quantifiable in order to create an attractive offer for research and industry.

Our second use case, which we can already mention at this point, is "Decentralized Cloud Computing as a Service", at the center of which are the above-mentioned Fabrics. These are, as already mentioned, combinations of compute nodes that perform resource-intensive computing tasks in exchange for compensation. In this way, pointless proof of work is eliminated, and at the same time an attractive reward system for our network participants is created. Our decentralized computing, in conjunction with storage and databases-as-a-service, forms a "Decentralized Cloud" (DPaaS).

Another use case we can communicate is offered by "smart nodes", which provide "Cognitive Security Operations" as a service. Smart nodes belong to the group of light nodes and have the ability to diagnose neighborhoods within the network by means of cognitive procedures. The diagnostic program is composed of several analysis layers. One of the layers is a "Differentiable Neural Computer" (DNC), which, in contrast to its predecessors ("Deep Neural Networks"), can draw on previously acquired knowledge because it is equipped with an additional memory matrix. In classical neural networks, the edge weights are overwritten in the process of backpropagation, so that intermediate results that may have been extracted from earlier, third-party data sets are lost despite their value. This phenomenon is also known as "catastrophic interference". The advantage of a DNC lies in its ability to write to the memory matrix and read from it without affecting the edge weights. A DNC is thus able to reprogram itself independently and in a goal-oriented way, based on the underlying task. The DNC layer is intended to provide real-time inferences about the topological structure of the Tangle, which, in combination with certain parameters (such as an absolute maximum difference in the transaction rate or the average delay of message transmission between nodes), can yield valuable information about possible attacks.
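The core idea, an external memory that can be written and read without altering any network weights, can be illustrated with a minimal content-addressable memory in plain Python; this is a drastic simplification of a real DNC, which reads and writes with differentiable, learned attention:

```python
def cosine(a, b):
    """Cosine similarity between two vectors (0.0 for a zero vector)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

class ExternalMemory:
    """A tiny content-addressable memory matrix. Writing and reading
    leave any surrounding network weights untouched, which is what
    lets a DNC retain knowledge across tasks instead of overwriting
    it during backpropagation."""
    def __init__(self, rows, cols):
        self.matrix = [[0.0] * cols for _ in range(rows)]
        self.next_row = 0

    def write(self, vector):
        self.matrix[self.next_row] = list(vector)
        self.next_row = (self.next_row + 1) % len(self.matrix)

    def read(self, key):
        # Return the stored row most similar to the query key.
        return max(self.matrix, key=lambda row: cosine(key, row))
```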

How far along are you with Helix? What are the next steps?

From a technical and development point of view, we are concentrating at the moment (August 2018) on the Helix protocol. It is still in the prototyping phase, in which stability and security proofs for the presented methods, models, and cryptography are under review. Internally, the protocol as well as the first use-case prototypes have already been running stably for some months. A wider closed user group will be granted access in January 2019, followed shortly thereafter by the provision of a first Open Alpha.

The wallet is already fully implemented and will soon enter the second prototyping phase, in which peer reviews and security audits will be conducted.

The MainNet launch (Open Beta) will take place in Q2/2019 at the latest. The plan is to launch the network with its full core functionality; however, at least some functional areas will still be in an experimental stage at the time of the Open Beta.

Since the network will in the future be operated, maintained, and promoted by a non-profit foundation, we have established the Helix Foundation, headquartered in Berlin, which has recently been officially registered and which will be financed through our ICO; the starting shot for the ICO is planned for 30 September 2018.
