Digital Twin

V2

Announcing the latest Digital Twin release

Meet your new protocol, same as the old protocol.

Well, if that were strictly true there wouldn’t be anything to write about. You may need to squint to notice that something has changed with this release, yet the irony is that almost everything has. As a protocol that operates at the infrastructure layer, allowing other ticketing companies to build on top of our tech, we take stability and continuity seriously, and part of that obligation is making sure everything works just as it should, no matter when, where, or by whom tickets were created with us.

For those new to GET Protocol: we offer a number of solutions depending on your needs as an Event Organiser, Partner Integrator, or Ticketing Company, and they all work together to help you manage your event through its complete lifecycle and to mint NFT tickets for your attendees. Here we’re releasing a big update to our Digital Twin product: the foundation layer responsible for creating all of our NFTs on the blockchain.

From First Principles

At the start of the year we set out to build the only NFT ticketing platform ready for global scale, and we’re now launching the next iteration of the Digital Twin, ready to take on that challenge. Because all NFTs on GET Protocol are minted with Digital Twin APIs, these benefits will be available in our white-label product too. In the world of blockchains and real scalability nothing can be taken for granted, so for the last 6 months we’ve left no stone unturned in pushing towards that goal.

As a blockchain protocol we operate on a number of principles:

  • We believe in open, public, and permissionless networks. Our mission is to become the data standard for ticketing worldwide, and that isn’t possible by cutting corners with ultra-high-speed, ultra-non-transparent “private blockchains”.

  • We believe in utility and experience. Minting a million NFTs on a ghost chain with subsidised fees would be simple but what is scalability without utility? We expect our NFTs to have integrations with top-tier marketplaces, social media platforms, and wallets.

  • We believe in affordability. Thanks to the wonders of transaction fee markets and mempool priority, we have one unsustainable option to scale up: pay for it. A data standard will only become the standard if it’s affordable to everyone.

  • We believe in on-boarding global volume to Web3. 20 NFT tickets a day won’t move the needle towards global adoption, and if we want to be serious about this we need to be ready to scale with the industry.

So where does that leave us? We need to be able to mint massive amounts of NFT tickets onto well-integrated and well-established public blockchain networks cheaply, and for the last 6 months this is what we’ve been working towards. Easy peasy.

TL;DR

This article goes into depth on the development, decision-making, and improvements behind our V2 release, as well as showing off the work that has led us to this point. For those that are just here for the headlines, here’s what has happened:

  • Our new V2 Ticket Engine API will be rolling out, giving us a 316x boost in peak throughput and an 11.8x improvement in minting costs.

  • We’ve bundled in a number of product improvements to our blockchain structure to give event organisers more control over their events, with each event now being its own smart contract.

  • We’re releasing our new Integrator Dashboard to put integrators further in control of managing their accounts and topping up their balances.

  • A number of small but important improvements have taken place: further decentralisation of the ticket explorer and a clearer path forward to a multi-chain GET Protocol.

  • In V2 all ticket revenue is on-chain and settled in GET.

A Globally Scalable Platform

To define success you need to draw a line in the sand. Around the time we started to take the scale-up attempt seriously, GET Protocol hadn’t long passed its millionth ticket, and as conversations about the design and architecture of the new system started to unfold, that number was fresh in our minds. How about if we could do that in a week? 52 million a year sounds like a target to hit! Without knowing what we were getting into, that became the goal.

The throughput of V1 can be found quite easily as yearlyThroughput = maxTicketsPerDay * daysInYear, and based on organic usage it was understood that the existing blockchain relayers were able to process around 20,000-30,000 transactions per-day. Per-year that works out to 7.3M-10.95M tickets, already falling short of our hopeful target by some margin. When benchmarking it is always safer to take the lower bound, so we felt confident that we could produce up to 7.3M tickets per-year under the live system: high enough to handle current demand, but low enough to show some slight lag during heavy volume days of concurrent stadium events. 52M/year would represent a 7.1x increase in throughput, an ambitious target, but it’s better to aim high.

Exploring the Sea of Standards

The power of an NFT lies in its interoperability and its ability to plug and play easily into tooling and products that already exist within the wider crypto ecosystem. Straying from the standards allows us more flexibility to mint tickets efficiently but reduces our integration potential with marketplaces and social networks, something worth keeping in mind at all times when choosing a standard to work with. Our initial discovery led us towards ERC-1155, a less common but more efficient NFT standard with OpenSea support and an increasing level of popularity among developers. Although promising, after some experimentation we decided that third-party application support wasn’t quite at the level needed and would risk leaving ticket buyers with an NFT of compromised utility and fewer guarantees of support on their favourite marketplaces, NFT galleries, and wallets.

Keeping our NFTs within the ERC-721 standard was the direction to take for maximum utility, and that meant finding a way to mint 721s as efficiently as possible by working on the other two main inefficiencies: storing more data than necessary and processing only a single ticket per-transaction. If these could be improved then there was a good chance of finding a viable path forward with ERC-721.

Reducing our Gas Usage

V1 of the protocol was designed for flexibility, ensuring that all data about a ticket could be recorded in the most granular and accurate way possible, which is often the right way to approach new software. In the early phases of a product it is not yet obvious how it will be used, which features will be needed, and which won’t stand the test of time, so it’s common to cover all bases, and that worked well for us. But the earlier you are in a product’s lifecycle the less information you have; you should refine as you go. While accounting for all future use-cases gives the most flexibility, it also increases bloat, and it offers us an opportunity to optimise now that we’ve gained experience. So that’s what we did.

Approaching an optimisation task is often like plugging holes in the bottom of a sinking boat: you’ve got to start with the biggest and baddest ones first, and in Solidity that is the amount of data you store on-chain. Because each NFT is its own boat, multiply that leakage by the number of tickets created and it quickly becomes important to reduce our on-chain storage before attacking other issues.

Solidity and the EVM meter gas usage for storage in the number of ‘slots’ that are used, where each slot is a 256-bit number: you have 256 individual bits to use, each of which can be set to a 1 or a 0, and that’s it. Go a single bit over those 256 and you’ll get charged for another whole slot, no ifs and no buts. Each slot costs 20,000 gas, and looking at one of our V1 tickets we can see that we’re currently using around 365,000 gas per NFT.

What is all that gas going towards? Let’s account for some of the main offenders:

struct TicketData {
    address eventAddress;     // 160 bits: links the ticket to its event
    bytes32[] ticketMetadata; // dynamic array, a minimum of one 256-bit slot
    uint32[2] salePrices;     // 2 x 32 bits
    TicketStates state;       // 8-bit enum
}

The TicketData structure is stored for each NFT. An address is 160 bits and is used to link a ticket to a particular event, a bytes32 is 256 bits and one is used to store the price in the currency of the event, two uint32s add another 64 bits to the tally, and the TicketStates enum adds 8 bits. Due to the way Solidity packs these into storage, this uses a slot for the eventAddress, a minimum of one for the metadata, and one for the rest: a total of three slots for our custom TicketData.

Take-away: Huge improvements in storage requirements were made by rethinking and redesigning the way we represent ticket and event information.

But wait, there’s more… The ERC-721 standard itself needs to store some data to track the ownership of the NFT and its unique ID. V1 uses OpenZeppelin’s ERC721Enumerable contracts to handle the ERC-721 standard, and these store data in a way that makes reading certain information about an NFT slightly easier at the expense of making production of an NFT a lot more expensive. Others have written at length about ERC721Enumerable’s inefficiencies, but all you need to know here is that it adds a lot of overhead we can do without, estimated at an additional six or seven storage slots on top of our three. And let’s not forget our on-chain economics, as each ticket has its own fuel tank that stores the amount of fuel it has for the rest of its lifecycle. Removing from the integrator’s account balance and adding to the ticket balance accounts for another two slots’ worth of storage. All said and done, this storage accounts for 11-12 slots, in the region of 220,000-240,000 gas per NFT.

V2 reduces this down to a single slot and 20,000 gas of storage, with some bits to spare:

struct TokenData {
    address owner;      // 160 bits: the NFT's current owner
    uint40 basePrice;   // 40 bits: USD-denominated price at the time of sale
    uint8 booleanFlags; // 8 bits: scanned, checked-in, invalidated, claimed, ...
}

The owner address cannot be avoided as each NFT must have an owner, giving us the first 160 bits of storage. We assign the next 40 towards storing only the basePrice, which is the USD-denominated value of the ticket at the time of sale. Then some clever tricks are used to represent 8 different states within a single 8-bit integer, storing whether a ticket has been scanned, checked-in, invalidated, or claimed. Because each event is its own smart contract we no longer need to store the event address with each ticket, and this leaves us with a positively healthy 48 bits to spare for future upgrades. But what about the on-chain economics? The fuel tank? The rest of the NFT standard?
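To make the flag-packing idea concrete, here is a minimal sketch of how several ticket states can live in one 8-bit integer. The flag names, bit positions, and helper functions are illustrative assumptions rather than the production layout:

pragma solidity ^0.8.0;

// Sketch: packing several ticket states into a single uint8.
// Flag names and bit positions are illustrative, not the production layout.
contract TicketFlagsSketch {
    uint8 internal constant FLAG_SCANNED     = 1 << 0;
    uint8 internal constant FLAG_CHECKED_IN  = 1 << 1;
    uint8 internal constant FLAG_INVALIDATED = 1 << 2;
    uint8 internal constant FLAG_CLAIMED     = 1 << 3;

    mapping(uint256 => uint8) public booleanFlags; // tokenId => packed flags

    function _setFlag(uint256 tokenId, uint8 flag) internal {
        booleanFlags[tokenId] |= flag; // switch a single bit on
    }

    function hasFlag(uint256 tokenId, uint8 flag) public view returns (bool) {
        return (booleanFlags[tokenId] & flag) != 0; // test a single bit
    }
}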

Batching our Minting

The final big target in our sights for efficiency was our handling of 1 transaction = 1 mint: when you consider all the overheads of merely confirming each transaction, you’re better off batching these into chunks. Soon after the release of Ticket Engine V1 we realised that the maximum number of transactions you can have queued for a single address (while still making sure the network will confirm them) was 16, due to a limit enforced by peer nodes within the network. Since each integrator can have its own relayer address, this allowed for 16 transactions to be queued per-integrator at any given time; as these were confirmed, more could be sent for confirmation.

By batching the ticket actions the V2 contracts are able to push minting costs down drastically.

Building V1 without batching forced us to make every transaction count and pushed development towards building a system that was very good at confirming transactions at scale. Despite some challenges at the start, our development team pushed onwards and Ticket Engine got very (very!) good at confirming transactions in all weather conditions of the Polygon network. Even so, we weren’t maximising the value of each transaction confirmation, leaving a lot of growth potential on the table. In other words, these transactions were getting confirmed amazingly well no matter the gas price or the competition, but we were then using each one to do a single NFT mint, losing all the advantage we’d built in the infrastructure layer.

So to increase our volume you can use another simple equation, maxTicketsPerDay = maxTransactionsPerDay * ticketsPerTransaction, and it was this tickets-per-transaction that we began to focus on. If only 10 tickets could be packed into each transaction then our goal of 52M/year would be achieved and everyone would be happy. The first design of the new smart contracts batched a single action-type, but this felt clunky and inflexible, so after a number of breakthroughs, batching all ticket interaction types is now possible with V2. We’re happy to say that this has now been tested successfully to 250 NFT tickets per-transaction, far more than the 10 we were hoping for.

Take-away: NFT minting is now batched so that multiple mints can happen in a single transaction, and tickets can be updated as part of a batch rather than individually. Hoorah!

Batching ticket interactions in this way helps from all angles: our API has to confirm fewer transactions for the same number of tickets, and only needs to update certain parts of the on-chain data per-batch rather than everything per-ticket. Calculating the GET fuel usage is the perfect example of this, as it requires writing the same two pieces of data to the chain: the reserved fuel needs to increase and the integrator’s available balance needs to decrease. When batching, we no longer need to update these per-ticket; they can be updated once per-batch. The same approach is taken for updating the NFT token balances and ownership data, resulting in heavy savings when these costs are spread across the batch. Where two storage slot updates would previously cost a full 40,000 gas per-ticket, doing that same operation once for a batch of 250 socialises the cost to a featherweight 160 gas per-ticket.
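For illustration, here is a minimal sketch of how a batched mint can amortise the fuel accounting across the whole batch. The names, signatures, and storage layout are hypothetical and heavily simplified, not the production contracts:

pragma solidity ^0.8.0;

// Sketch: batched minting with per-batch (rather than per-ticket) fuel accounting.
contract BatchMintSketch {
    struct TokenData {
        address owner;
        uint40 basePrice;
        uint8 booleanFlags;
    }

    mapping(uint256 => TokenData) internal tokenData;
    uint256 public availableFuel; // the integrator's topped-up balance (one integrator, for simplicity)
    uint256 public reservedFuel;  // fuel held for the lifecycle of minted tickets

    function batchMint(
        uint256[] calldata tokenIds,
        address[] calldata owners,
        uint40[] calldata basePrices,
        uint256 fuelPerTicket
    ) external {
        uint256 count = tokenIds.length;
        require(owners.length == count && basePrices.length == count, "length mismatch");

        for (uint256 i = 0; i < count; i++) {
            // one storage write per ticket for the packed token data
            tokenData[tokenIds[i]] = TokenData(owners[i], basePrices[i], 0);
        }

        // fuel accounting is updated once for the whole batch instead of
        // twice per ticket, spreading the cost across every ticket in the batch
        uint256 totalFuel = fuelPerTicket * count;
        availableFuel -= totalFuel;
        reservedFuel += totalFuel;
    }
}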

To make this easier to follow we’ve charted out the impact of the number of tickets in each batch along with its total gas usage and the per-ticket gas usage. Even batch sizes as low as 10 can have a huge impact on the overall efficiency of each transaction.

Running an Event… Based Architecture

Scaling our blockchain contracts only tells one chapter of the story. A big part of our value proposition as an infrastructure provider for digital ticketing is that enterprises no longer need to hire a blockchain team just to experiment with NFTs. To bridge the Web2 and Web3 worlds we offer our Ticket Engine API for minting Digital Twin tickets of existing ticket inventory, and to handle this demand we’ve completely rearchitected our services from the ground up to be event-driven and asynchronous.

In non-technical speak, this means that the API application code has been broken down into smaller modules that are highly performant at a single task, and these modules are only activated when they are instructed to do so. When the API receives a request to mint an NFT, the request is placed into a queue and a module is instructed to perform a specific function on it before passing it to another queue. We’ve separated out the data validation, the processing, the batching, and the broadcasting of each transaction into separate pieces of code that can be built and executed independently of one another. This modular architecture not only gives us more technical scale but also a simpler development environment for future data handling.

Moving much of the initial request processing to later modules allows the API to respond to requests far faster than possible in V1, producing a confirmation response within ~5 milliseconds per action requested.

Developing applications that use the Ticket Engine API has been simplified in V2: instead of having to separate individual actions into batches of 100, we now offer a single endpoint accepting a maximum of 5,000 actions, a 50-fold increase. Previously, registering sold and scanned tickets would need to be separate API requests, but these can now be sent together and the API will process them in the order received, with no need for separation.

Taking Stock

When you bring it all together you end up with something quite powerful, which was confirmed by the initial production environment benchmark tests that resulted in 57,500 NFTs minted in 13 minutes and 6 seconds. That works out to roughly 73 NFTs per second, or over 6 million per day against V1’s ~20,000 per day. What would have taken almost three days in V1 can now be handled in less time than a coffee break. The API and back-end services have been shown to be highly performant at confirming transactions on Polygon mainnet, with the stress-test demonstrating a peak 316x efficiency improvement in throughput.

As for the cost of minting NFTs: where a single ticket under V1 would previously cost 365,000 gas, this has been compressed all the way down to 30,950 for batches of 250, a 91.5% cost reduction.

Upgrading Functionalities

This update is not all scale and throughput. We’re also releasing a number of key product upgrades and features that drive the next theme of our roadmap… desirability.

A Contract Per-Event

The number one challenge we faced with the composability of V1 NFTs was that they were minted on a single smart contract that contained all tickets from all events from all integrators, and as the wider ecosystem of NFT tooling has evolved it has become more and more challenging to operate on this foundation. NFT marketplaces create collections based on the smart contract that created the NFTs, and DAO governance tooling can collect votes based on whether you own an NFT from a particular collection (again, relying on the contract address). The list of examples goes on, but it became clear that the right move would be to give each event its own smart contract, allowing creators and artists to use these NFTs within their communities and platforms more effectively without developer assistance.

In V2 the tickets of every event will be minted by a unique minting contract specifically deployed for the event at hand.

This instantly opens up a number of possibilities to the utility of a ticket NFT:

  • It becomes easy to check whether an event attendee owns an NFT for a specific event (see the sketch after this list).

  • Voting and DAO rights can be offered to these attendees using standard available tooling.

  • NFT-gated community access becomes easy to set up, granting owners of a ticket access to certain Discord and Telegram groups.
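To give a feel for how simple the first check above becomes, here is a rough sketch of an ownership test against an event’s dedicated NFT contract. It assumes only the standard ERC-721 balanceOf interface; the contract and function names are placeholders:

pragma solidity ^0.8.0;

interface IERC721 {
    function balanceOf(address owner) external view returns (uint256);
}

// Sketch: gate access on ownership of any ticket NFT from a single event's contract.
// Because every V2 event has its own minting contract, the contract address alone
// identifies that event's collection.
contract EventGateSketch {
    IERC721 public immutable eventTickets;

    constructor(address eventTicketContract) {
        eventTickets = IERC721(eventTicketContract);
    }

    function isAttendee(address account) public view returns (bool) {
        return eventTickets.balanceOf(account) > 0;
    }
}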

One feature that isn’t quite ready for release yet is collecting NFT trading royalties to an address on a per-event basis, incentivising creators to produce valuable and tradable NFTs. This update lays the groundwork for that and sets the stage for future feature upgrades.

This is more than just a technical feat... all popular NFT collections are minted on their own dedicated smart contract, and because V1 minted all NFTs onto a single contract, their desirability suffered from the lack of curation and personalised collection details.

Take-away: Every created event will deploy a new NFT minting contract. This will give every event its own OpenSea collection, improving uniqueness, scarcity, and personality.

A Portable Protocol

It remains the case that picking a winning blockchain is an exercise in betting and guesswork. Will there even be a winner? Will the future be multi-chain? Or will it be cross-rollup? Or neither? Truly, no one really knows.

V2 architecture allows deployments on multiple L1s (or L2s) at the same time.

The best way to hedge our bets is to build on the most common and compatible standards available to us while also ensuring we have the ability to quickly launch on new chains depending on integrator preferences and demand. It has never been our approach to launch everywhere and spread ourselves thin, and that remains the case, but V2 is designed to allow quick deployment to other EVM-compatible chains if there is demand from integrators to do so. The keen-eyed may have already seen hints of this in the latest release of the ticket explorer:

The full ID of a ticket now includes its blockchain and the event ID to allow any ticket on the protocol to be found.

An Updated Explorer

The Ticket Explorer has taken steps towards greater decentralisation and now fetches almost all of its data using only The Graph. While it may seem like a small back-end change, and to many users of the ticket explorer it will be, it represents a crucial shift in how we architect our applications. While private data was never used within the Ticket Explorer, it is now much easier to confirm that this is the case, since the site is generated from the subgraph data alone. The Ticket Explorer will continue to be developed as the primary place to view NFT ticket information, and this new foundation helps us operate only on public, blockchain-driven data. A small win for the architecture but a big win for the ethos of an open protocol.

Builders and integrators wishing to use this data to create applications or user experiences can now do so, and the code is publicly available on our GitHub.

An Integrator Dashboard

Saving one of the most exciting updates until the end: we’re now in the process of rolling out our Integrator Dashboard to existing integrators & partners. It provides a convenient way for integrators to top up their GET fuel balance and provides necessary account management features such as balance alerts and usage statistics.

Integrators will now have a way to top up their account balance directly using GET within their wallet.

Partners that wish to have a deeper look into their usage statistics and blockchain data will be able to see the exact blockchain usage as well as the events that they have recently created along with their smart contract addresses. Guides and support documentation will be provided to partners as part of the rollout and will be available publicly from the dashboard.

As our partners are business entities, we’ve taken extra care to make the dashboard business-ready from launch, offering top-up transactions verifiable on-chain as well as top-up receipts in PDF form for accounting purposes. For those that prefer a simpler top-up experience we’re working hard to implement direct top-ups using bank transfer, and we’ll release more information when this becomes available.

A Fuel Simplification

Depot? Silo? Fuel tanks? RIP we hardly knew ye.

We’ve moved away from a metaphor tied to fuel refinement and towards something that more accurately describes the accounting flow both within the code and documentation. For the economics to be widely understood it needs to be simple.

The accounting workflow is largely the same as in V1, however the three different states the GET fuel can be in have been renamed:

  • Available Fuel represents an integrator’s balance on their account: already topped up and available to pay for NFT mints.

  • Reserved Fuel is accrued from Available Fuel at the time of minting and is held within the system for the ticket’s lifecycle until it is checked-in or invalidated.

  • Spent Fuel is accrued from Reserved Fuel upon check-in and invalidation and is ready to be collected from the protocol to the destination addresses.

Any simplification, no matter how minor, can only be a good thing, right?

As a ticket moves through the event cycle, GET moves within the Economics.sol contract from Available Fuel to Reserved Fuel, and eventually ends up in the Spent Fuel balance.
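Here is a minimal sketch of that flow, simplified to a single integrator. The function names and signatures are illustrative, not the actual Economics.sol interface:

pragma solidity ^0.8.0;

// Sketch: the three fuel balances and how GET moves between them.
contract FuelFlowSketch {
    uint256 public availableFuel; // topped up by the integrator, ready to pay for mints
    uint256 public reservedFuel;  // held for each ticket's lifecycle after minting
    uint256 public spentFuel;     // released on check-in or invalidation, ready for collection

    function topUp(uint256 amount) external {
        // in reality this would transfer GET from the integrator's wallet
        availableFuel += amount;
    }

    function reserveOnMint(uint256 amount) external {
        availableFuel -= amount;
        reservedFuel += amount;
    }

    function spendOnCheckIn(uint256 amount) external {
        reservedFuel -= amount;
        spentFuel += amount;
    }
}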

In V2 all ticket revenue is on-chain and settled in GET


Before you go, there is one final thing…

In V2 all ticket revenue is on-chain and settled in GET.

All ticket revenue, service fees and sales taxes will be settled in GET via the Integrator Dashboard.

But I thought that was already the case?

It might be top of mind, and while it is almost true, there’s a little bit of history wrapped up in this question. The V1 economics live right now were based mostly on the token revenue numbers of the system that preceded it, and inherited the status quo when it came to the amount of GET needed to service a ticket. The on-chain, real-time economics of V1 that you’ve seen live for the previous year were no doubt light years ahead of the quarterly reports that came before, but the usage numbers, pricing, and accounting processes within the company remained mostly unchanged. You can’t pivot on a dime, and representing the GET requirement for NFT ticket mints was already a large chunk of upgraded functionality, so much of the development of a fully on-chain accounting system was deferred until a later release: this one.

So what’s the difference?

For starters, the amount of GET needed to create a ticket on the protocol was disconnected from the incoming revenue streams of the White-Label product, and the calculation of these figures was arbitrary and carried over from legacy decisions. It also meant that, because of existing accounting practices within the company, the token revenue was ultimately treated as a cost at the end of an accounting cycle rather than as the foundation on which the accounting system is built. For that we needed the Integrator Dashboard.

In plain English?

For the White-Label SaaS application, 3% of the primary and secondary market order value of each NFT minted on the protocol will be required as fuel; for the Digital Twin this will be a fixed fee of $0.02 per-NFT.
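As a rough, hypothetical illustration of how these USD-denominated fees could translate into GET fuel amounts (the rate source, precision handling, and names are assumptions, not the production pricing logic):

pragma solidity ^0.8.0;

// Sketch: converting USD-denominated fees into GET amounts.
// getPriceUsdCents would come from a price feed in practice; here it is a parameter.
contract FuelFeeSketch {
    uint256 public constant WHITE_LABEL_FEE_BPS = 300;      // 3% of order value
    uint256 public constant DIGITAL_TWIN_FEE_USD_CENTS = 2; // $0.02 per NFT
    uint256 internal constant GET_UNIT = 1e18;              // GET uses 18 decimals

    function whiteLabelFuel(uint256 orderValueUsdCents, uint256 getPriceUsdCents)
        public pure returns (uint256 fuelInGetWei)
    {
        uint256 feeUsdCents = (orderValueUsdCents * WHITE_LABEL_FEE_BPS) / 10_000;
        return (feeUsdCents * GET_UNIT) / getPriceUsdCents;
    }

    function digitalTwinFuel(uint256 ticketCount, uint256 getPriceUsdCents)
        public pure returns (uint256 fuelInGetWei)
    {
        return (ticketCount * DIGITAL_TWIN_FEE_USD_CENTS * GET_UNIT) / getPriceUsdCents;
    }
}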

You can now consider (in the ‘enforced by smart contracts’ sense) the Digital Twin service fees to be a subset of the White-Label fees, and these will be accounted for separately from the full service fee that would be charged for the White-Label, all in GET. This gives us the opportunity to align on and set the future long-term direction of product separation and responsibilities between the entities that exist today, the GET Protocol Foundation and the DAO. Each entity has a different role to play in the success of the protocol, with different strengths and weaknesses, and although this is still being worked on, we’re starting to build a migration path to allow development of the decentralised technologies within the protocol under the umbrella of the DAO. More to follow.

Each interaction will split fuel based on core Digital Twin costs and white-label revenue. Both the DAO and GET Protocol Foundation will receive revenue from this split.

What happens to the GET fuel?

Well, for starters, more fuel will be used. A lot more. And we need to have a strategy for how to capture value as efficiently as possible. Those that are familiar with us already will know that we’re building a long-term vision for the future of ticketing, and as a result we need a sustainable business to do that. This means some of this GET will need to be sold, and this should naturally be the case and expected: there are operational expenses to cover, staff costs, future hiring, and growth, and these things won’t happen if the fuel just piles up month after month. Nor is it the Protocol’s mission to only collect fuel; it will be reinvested, allocated, and spent for sustainability. The DAO falls under the same realities, and not all of its goals can be achieved by collecting and holding GET ad infinitum.

The true elegance of this system, however, is that it gives us a lot of flexibility over the terms on which the fuel is sold. We’re exploring a system that will work in lockstep with future staking mechanisms, offering this collected GET fuel directly to committed stakers so that this GET remains in the hands of those who also share the long-term vision of the protocol. We may only be at the start of working out the finer details of these mechanisms, but the idea is solid: we increase the demand for GET through its fuel usage, and we retain this value by ensuring we offer it to those that are in it with us for the long haul.

Take-away: All per-order ticket service fees & revenue will be settled in GET, which equates to 3% of a White-Label order value and $0.02 per Digital Twin ticket.


There’s a lot to digest, but if you’re still with us and want to know more, the following resources are available:

And if you’re interested in connecting your own ticketing system to GET Protocol then get in touch to get set up.

The GET Protocol Foundation

Since 2016 we have been building ticketing infrastructure powered by the latest technology to upgrade the experience for all in the ticketing chain.

Ticketing companies of all sizes use our infrastructure to get clearer insights, generate greater revenue and maximise the connection through their tickets.


A parting note

Keep up to date with GET Protocol on social media via our Twitter.

