It was roughly two and a half years ago that the founders of Humanode and the original core of the team realized that the Humanode concept might actually be technically feasible, thanks to recent advances in technology.
Needless to say, some of the original ideas and technological frameworks were tried and tested only to find out that they simply didn't work, or were too early-stage to implement. The whitepaper had to go through more than one revision, and the number of videos and articles that had to be replaced was heartbreaking.
Many of our engineers featured for their work on the Humanode protocol, like Mozgiii, Noah, Henry, and Tony, have become well known in our community, and yes, we plan to feature their stories once the media team is able to decode their diaries. But today, we would like to present the stories of some of our more hidden team members who shape the core of the Humanode bio-authentication technology: who they are, what their goals are, and where they are heading.
The consensus team
Dmitry Lavrenov, one of the core engineers of the Humanode consensus team, states that they had three clear goals heading into the current development:
- creating a proper consensus engine that supports the required performance, scalability, and latency
- integrating bioauth (biometric authentication) into the consensus engine
- supporting EVM compatibility
Dmitry, who loves challenges, remembers the path fondly. “Our journey started with researching existing state-of-the-art approaches that achieve the best results in terms of performance, scalability, and latency. To understand the system model itself, we defined the important pieces and extracted five core consensus components: block proposal, block validation, information propagation, block finalization, and the incentive mechanism. I would like to mention that various design choices in a consensus protocol can greatly impact a blockchain system’s performance, including its transaction capacity, scalability, and fault tolerance.”
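The five components Dmitry lists can be pictured as the pluggable interface of a consensus engine. Here is a purely illustrative sketch in Python; the actual Humanode code is built in Rust on Substrate, and these method names are our own:

```python
from abc import ABC, abstractmethod

class ConsensusEngine(ABC):
    """Illustrative decomposition into the five core consensus components."""

    @abstractmethod
    def propose_block(self, transactions):
        """Block proposal: assemble a candidate block from pending transactions."""

    @abstractmethod
    def validate_block(self, block):
        """Block validation: check a proposed block against the protocol rules."""

    @abstractmethod
    def propagate(self, block, peers):
        """Information propagation: gossip blocks and votes across the network."""

    @abstractmethod
    def finalize(self, block):
        """Block finalization: decide when a block becomes irreversible."""

    @abstractmethod
    def reward(self, author):
        """Incentive mechanism: reward honest participation."""
```

Framing the design this way makes the later trade-off discussion concrete: each protocol the team surveyed is, in effect, a different choice for these five slots.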
As a result, the consensus team researched the Nakamoto consensus protocol, including improvements like GHOST and Bitcoin-NG, as well as Proof-of-Stake-based protocols (committee-based PoS, BFT-based PoS, Delegated PoS), Proof of Authority, Proof of Elapsed Time, and Proof of Retrievability (Proof of Space).
Dmitry smiles while looking back. “We also considered novel DAG-based approaches that look amazing in terms of performance: SPECTRE, PHANTOM, IOTA Tangle, Byteball, Nano, and Avalanche. Additionally, we spent some time looking deeper into the Avalanche approach, as it achieves good results in real life, not just in theory. Remember, the only thing we had at the time was a theory on how it could be done.”
After considering different frameworks that would allow Humanode to implement its own protocol and code in a more flexible way, the team chose to work with Substrate, a modular framework that enables the creation of purpose-built blockchains by composing custom or pre-built components.
“We chose Substrate after carefully evaluating the alternatives (building the code from scratch or using one of the other existing codebases) for several reasons. The first reason is that Substrate is designed to be used as a platform to build blockchains. In other words, it is by design a developer tool rather than a final product. Using it was more appealing than basing our development on a codebase that implemented a particular blockchain project. The second reason is that the Substrate code quality is rather high, and it is clear that the people working on it care about quality. And the third reason was that we wanted something closer to a library than a framework, and with Substrate, the flexibility and modularity of the code are exactly where we needed them. If we were building the code from scratch, we'd take a very similar approach to the one the Substrate developers have taken.”
Other noted reasons were that Substrate is constantly being worked on and improved by many people, meaning that Humanode gets a stable stream of improvements simply by building on it. There is also a vibrant community around Substrate, so there are many people the team can talk to and ask for help if it faces any trouble with Substrate itself.
Substrate has proven to be an excellent tool so far, and it allows our team to focus on issues specific to Humanode, rather than spending time on problems common to all blockchain building.
And finally, Substrate provides already-implemented slot-based consensus protocols like Aura and BABE for block production, and GRANDPA for block finalization.
The Machine learning team
One of the main features of the Humanode protocol naturally has much to do with the biometric authentication protocol. And in order to make bioauth a reality, we need to work with machine learning and liveness detection.
The Humanode machine learning team consists of members who, above all, enjoy their privacy and rarely show themselves in public, but they have agreed to let us take a peek at what they are working on.
“We have been focusing on building the machine learning infrastructure for the Humanode bioauth system and have been developing facial recognition and liveness detection (face anti-spoofing) systems on our own. The accuracy of our state-of-the-art facial recognition algorithm is more than 99.85%. Oh, and we have already developed the initial version of the passive and active liveness detection systems.”
Liveness Detection systems allow the network to determine if it is interfacing with a physically present human being and not a spam bot, inanimate spoof artifact, or injected video/data.
“For those who don’t know, passive liveness detection determines liveness from just a single image. Our passive liveness detection system can detect presentation attacks like photo attacks, screen attacks, and silicone masks with great accuracy, currently at more than 95%. Our active liveness detection system presents users with challenges like blinking their eyes, turning their head left or right, making specific faces, or showing certain emotions, and the system determines liveness based on the results. Oh, and yes, we even developed an internal website for testing facial recognition and liveness detection, and it's being widely used and tested by Humanode team members.”
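The active challenge-response flow described above can be sketched roughly as follows. This is a toy illustration with made-up function names, not Humanode's actual pipeline; in practice the per-frame action detection would come from a trained video model rather than being passed in as a list:

```python
import random

# Hypothetical set of challenges the server can issue
CHALLENGES = ["blink", "turn_left", "turn_right", "smile"]

def issue_challenges(k=3):
    """Server picks a random sequence of actions the user must perform on camera."""
    return random.sample(CHALLENGES, k)

def check_liveness(issued, detected_actions, min_confidence=0.9):
    """detected_actions: list of (action, confidence) pairs produced by a
    per-frame action classifier. The user passes only if every issued
    challenge is detected, in the issued order, with enough confidence."""
    frames = iter(detected_actions)
    for challenge in issued:
        for action, conf in frames:
            if action == challenge and conf >= min_confidence:
                break  # challenge satisfied, move on to the next one
        else:
            return False  # ran out of frames before matching this challenge
    return True
```

The key property is that the challenge sequence is random and ordered, so a replayed or pre-recorded video is unlikely to contain the right actions in the right order.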
Nowadays, deepfake technology is one of the greatest threats to biometric authentication systems.
“So we developed an initial version of deepfake detection, but as deepfake technologies are evolving rapidly, this field still needs further development.”
The machine learning team is currently focusing on combining cryptography with facial recognition and liveness detection.
MingDong, a core member of the machine learning team, smiles as he says, “The facial recognition part is already implemented, and we will start working on the liveness detection combination soon. It might mean developing a totally new machine learning framework from scratch, but our machine learning engineers have relevant experience, so we can carry this out successfully. This is the most challenging machine learning task I have taken on so far. But we have been through a lot, and I strongly believe that our machine learning algorithm can ensure the idea of one human = one node."
WangXing, MingDong’s development partner, says, "My mission is to make sure that one human is one node, bringing Sybil resistance to crypto using facial recognition and liveness detection. I am really proud to be given the opportunity to work on such an innovative project, and I am pretty sure that you will enjoy being a human node very soon. I am a human node!"
Meanwhile in Cryptography
The cryptography team has one of the most challenging tasks in the whole Humanode project. While Humanode works with partners, such as FaceTec, for the first rendition of Humanode’s bio-authentication, the task of the cryptography team is to dive deep into the world of cryptography in order to develop the next generation of Humanode biometric authentication, which the team hopes to show the world in three to four years. Fortunately, the Humanode cryptography team is ready for the task.
Rafael Gelder, the lead Humanode cryptographer, grins when he talks about what we are trying to build.
“The next version of the biometric authentication system is based on a neural network that extracts the feature vector from the user's face image privately on the node. Naturally, several questions need to be answered from the cryptographic point of view: 1. How do we safely send the feature vector to an authentication authority to gain access to the system? 2. How do we identify users privately, preventing Sybil attacks? 3. Who is this authority in a decentralized system, and how is it configured using cryptographic protocols? And 4. How do we verify that the calculations are performed correctly by the user's node?”
The solution to the decentralization of the cryptographic protocols is the Collective Authority. The Collective Authority is a subset of the Humanode network that will be in charge of defining the basic parameters, generating the keys, and several other essential calculations during the authentication process.
“Once we had the goal well defined, we in the cryptography team devoted ourselves to researching and implementing the cryptographic protocols, namely key generation, encryption, decryption, the matching process, operations over the encrypted vectors, and verification,” says Hardik, Humanode’s cryptographic researcher.
The Collective Authority (CA) is responsible for performing the matching process over an encrypted feature vector. Humanode uses LWE-based encryption, which allows the protocol to perform homomorphic operations over an encrypted feature vector. Since the matching algorithm also involves 1:N matching, every node must use the same public key for encryption; the corresponding secret key exists only in distributed form and is never held by any single party.
Rafa continues. “Given two encrypted feature vectors, any node can perform the matching process over them. However, we need to decrypt the final output to get the matching score. Since the collective secret key is in a distributed form, we designed a decentralized decryption algorithm in which each node in the Collective Authority performs a partial decryption using its partial private key. Then the CA leader combines all the partial decryptions to get the final matching score. Without the involvement of the other nodes, a node cannot decrypt a feature vector or matching result on its own. Since we are not relying on any third party, we also have a zero-knowledge proof system. The ZKP system ensures that each node in the Collective Authority performs its tasks in the key generation and decryption processes as intended, without revealing any secret values. Trust is established in the system without explicitly trusting any participant.”
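To give a feel for how partial decryptions combine, here is a deliberately simplified, insecure toy of a symmetric LWE-style scheme with an additively shared secret key. All parameters and names are our own illustration, not Humanode's protocol; a real scheme would use proper public-key encryption, careful noise management, and the ZK proofs Rafa describes:

```python
import random

q = 2**16          # ciphertext modulus
n = 32             # LWE dimension
NOISE = 2          # small noise bound
SCALE = q // 256   # plaintext scaling; messages must stay below 256

def keygen_shares(num_nodes):
    """Each CA node holds an additive share of the collective secret key."""
    return [[random.randrange(q) for _ in range(n)] for _ in range(num_nodes)]

def collective_key(shares):
    """The full key (never materialized by any single node in a real system)."""
    return [sum(col) % q for col in zip(*shares)]

def encrypt(s, m):
    """Toy symmetric LWE encryption: b = <a, s> + m*SCALE + small noise."""
    a = [random.randrange(q) for _ in range(n)]
    e = random.randint(-NOISE, NOISE)
    b = (sum(ai * si for ai, si in zip(a, s)) + m * SCALE + e) % q
    return (a, b)

def add_ct(ct1, ct2):
    """Homomorphic addition: any node can do this without the secret key."""
    a = [(x + y) % q for x, y in zip(ct1[0], ct2[0])]
    return (a, (ct1[1] + ct2[1]) % q)

def partial_decrypt(share, ct):
    """Each node removes only its own share's contribution."""
    a, _ = ct
    return sum(ai * si for ai, si in zip(a, share)) % q

def combine(ct, partials):
    """The CA leader combines all partial decryptions and rounds away the noise."""
    _, b = ct
    phase = (b - sum(partials)) % q
    return round(phase / SCALE) % 256
```

No subset of nodes missing even one share can recover the plaintext, which mirrors the property Rafa describes: decryption only works when every Collective Authority member contributes.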
“The verification system can detect if any node is acting maliciously, and if this occurs multiple times, the offending node may be banned from participating in the Collective Authority,” says Hardik. “To maintain a certain number of nodes in the Collective Authority, we may need to add new nodes. To ensure this, we made the system dynamic, so the Collective Authority can easily add a new node without changing the public key.”
The cryptography team has a working implementation of the protocols in Python, and says the choice of Python was based on compatibility and ease of integration with the neural network and liveness detection systems, which are also written in Python.
That said, the cryptography team assures us that Python is not the end. “Now we will focus on implementing the cryptographic schemes in Rust, so we can get better performance and low-level handling of big integers.”
Looking into the future, the cryptography team talks of one more major task.
“The feature extraction process during authentication/registration will be done on the source node device, and the extracted feature will be shared with other nodes only in encrypted form. Because of this, a malicious node might tweak the NN parameters and produce a different feature vector for the same biometric data. Since the feature vector is in an encrypted form, a receiving node would never be able to detect the tampering and would proceed to perform the normal matching process. The result would be a Sybil attack, as we would have two entirely different feature vectors for the same biometric data. To prevent this, we are working on a zero-knowledge proof system for the NN, which allows a node to verify that the encrypted vector is indeed an encryption of the correct feature vector, without decrypting the encrypted feature vector. This ZKP system spans a series of layers of the neural network.”
The cryptography team fully agrees with the machine learning team that this will be an uphill battle, and an arms race with those who try to hack bioauth systems. That said, both teams are up for the challenge and fully expect to stay more than a few steps ahead of the game.
Bringing things together
Dmitry from the consensus team reminds us that their work is built for integration with what the other teams are cooking up.
“As we mentioned before, Substrate is quite flexible in terms of implementation for different goals. It consists of different protocol layers, including the consensus engine. So, in order to make things work, we started looking for a way to add the biometric authentication portion of the protocol there, by digging deeper into the Aura, Babe, and Grandpa implementations. As a result, we analyzed the whole path from submitting a transaction to the network to the finalization of a block that includes that transaction, researched who is and isn't able to produce blocks, mapped out the possible options for changing the validators list, understood the approaches to punishing malicious behavior, and so on. And finally, we were able to implement the Humanode bioauth consensus, a deterministic consensus protocol that is responsible for validating whether the authors of proposed blocks have successfully passed biometric authentication.”
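The core check Dmitry describes — "has this block author passed bioauth, and is that authentication still valid?" — can be sketched as follows. This is a toy Python illustration of the idea with invented names; the real logic lives in the Rust-based Substrate runtime:

```python
from dataclasses import dataclass

@dataclass
class Block:
    author: str       # the block author's validator key
    parent_hash: str

class BioauthValidator:
    """Tracks which validator keys currently hold a valid bio-authentication."""

    def __init__(self, ttl_secs):
        self.ttl = ttl_secs       # how long one bioauth session stays valid
        self.authenticated = {}   # author key -> expiry timestamp

    def record_bioauth(self, author, now):
        """Called when a node successfully passes biometric authentication."""
        self.authenticated[author] = now + self.ttl

    def validate_block(self, block, now):
        """Deterministic rule: the author must hold a non-expired bioauth."""
        expiry = self.authenticated.get(block.author)
        return expiry is not None and now < expiry
```

The expiry models the fact that bio-authentication is periodic: a validator that stops re-authenticating eventually loses the right to author blocks.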
One other point we cannot forget is that Humanode bioauth, or the crypto-biometric solution, is not only for the Humanode network. It is built as a layer-one technology designed to work with the majority of existing decentralized networks.
“To bring cryptobiometric technology to existing protocols, the Humanode network includes an EVM pallet that allows it to run Solidity smart contracts and use existing developer tools. By bridging Humanode to other EVM-compatible chains, the network will be able to provide private biometric processing and Sybil resistance to dapps and protocols based on other chains. A biometric smart contract written in Solidity and deployed on the target chain will communicate with, for instance, a decentralized finance protocol, and then send the request to the Humanode network, where the biometric data is stored. Without revealing the user's identity, the Humanode network sends a ZK liveness proof and identity check back to the DeFi protocol to prove the user is the same real human being, without using any PII (Personally Identifiable Information).”
Fortunately, the team working on EVM compatibility found that the Parity team had already implemented an EVM pallet providing exactly these features.
“So we researched their implementation and suggested some improvements that have since been merged into their code,” says Dmitry. “In the end, we got EVM integration into our code, including a bioauth precompile that allows us to verify whether a provided Ethereum address has passed our biometric authentication.”
Afternote: As our dev team advances step by step, the media team will continue trying to decode the scribblings left behind by our brave developers, and will deliver the decoded pages from the dev notes to the community as we move forward.