6th update of 2023: Putting finishing touches on new releases of just about everything

in HiveDevs · last year (edited)


Below are a few highlights of the Hive-related programming issues worked on by the BlockTrades team since my last report. I hope we’ll have a new release of hived, HAF, and various HAF apps and tools in about two weeks, but it has been a while since my last update, and these posts would just get too long to write if I waited any longer. I’m also preparing to go to HiveFest, so I’m going to try to keep this post pretty terse, with links to details about the work. I’d also like to point out that these are only some highlights: there are too many devs working on this now for me to report all the improvements (even some I’ve worked on personally aren’t mentioned).

Hived: blockchain node software

New features, optimizations, and bug fixes

New tests

Many new regression tests related to private key leaks, plus dedicated operation testing scenarios (limit_order2, transfer, transfer_from_savings, transfer_to_savings).

DevOps

Denser: modern replacement for Condenser

We’ve added several devs to this project since my last report. The current status of the UI can be tracked here: https://gitlab.syncad.com/hive/denser/-/wikis/Comparison-of-views-of-the-Denser-project-with-the-old-Hive-Blog

Clive: new Hive wallet with a text-based user interface

Clive is a Hive wallet written in Python that runs on your own computer (it is not a web-based wallet where the code comes from a remote server) so it is inherently more secure than web-based wallets. Currently there are two such wallets available and supported in the Hive ecosystem: 1) a command-line interface wallet (aka the CLI wallet) written using C++ and 2) a graphical interface wallet called Vessel (a JavaScript-based wallet).

Clive has been undergoing a lot of work: lately we’ve made many improvements and bug fixes and implemented support for an external CLI version (i.e. it enables executing individual wallet commands from a bash script).

We’re releasing an initial version (deployed as a docker image) that supports transfer operations as a technology preview, with a full release to follow later this year. I’ll make a separate post about how to get and/or build the initial version soon.

Hive Application Framework (HAF)

HAF is a SQL-based framework for creating highly scalable and robust 2nd layer apps that operate using data from the Hive blockchain.
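To give a concrete sense of what “SQL-based” means here, below is a minimal sketch of a script reading Hive data straight out of a HAF database with psycopg2. The database name, role, and table/column names are assumptions based on my reading of the HAF documentation, not a guaranteed schema; adjust them to your own deployment.

```python
# Minimal sketch of reading Hive blockchain data from a HAF PostgreSQL database.
# Names like "haf_block_log", "haf_app_admin", hive.blocks, and hive.operations
# are assumptions taken from the HAF docs; verify them against your setup.
import psycopg2

conn = psycopg2.connect("dbname=haf_block_log user=haf_app_admin host=localhost")
with conn, conn.cursor() as cur:
    # Highest block number currently stored by this HAF instance.
    cur.execute("SELECT max(num) FROM hive.blocks")
    head = cur.fetchone()[0]

    # Count the operations recorded in the most recent 1000 blocks.
    cur.execute(
        "SELECT count(*) FROM hive.operations WHERE block_num > %s", (head - 1000,)
    )
    ops = cur.fetchone()[0]
    print(f"head block {head}: {ops} operations in the last 1000 blocks")
```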

HAF bug fixes and test improvements

I’ve also been benchmarking HAF under various conditions, including running on a server with only 32GB of memory; results are posted here: https://gitlab.syncad.com/hive/haf/-/wikis/home

An “irreversible block only” version of HAF may be in the cards

I recently had an idea to provide an “irreversible blocks only” version of HAF (enabled by a simple command-line option on a regular HAF server), for performance optimization purposes.

For most blocks, such a server will only lag a couple hundred milliseconds behind a normal “reversible” version of HAF, due to OBI (one-block irreversibility), and might be as much as two times as fast for queries (just my guess at this point, pending actual benchmarks).

A HAF app would work on either configuration of the HAF server without any changes. It would basically be like a global switch that automatically converted all the HAF apps on that server to “irreversible block only” apps.

Docker compose scripts for easy deployment of API nodes

We’re creating docker scripts and documentation covering best practices for setting up an API node. The idea is to enable setting up an entire API node with just two or three commands plus an environment config file that bind-mounts node-related storage to various locations in the local filesystem. These scripts will be located in a new repo in the Hive group.

As part of this work, we’re developing and testing health checks for the various apps, especially the new HAF apps, and much of this work will be re-usable for future HAF apps.
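To illustrate the general idea (this is not the actual script we’re shipping), a health check typically just asks a service for its latest data and verifies that it is fresh enough. The endpoint and the 30-second threshold below are assumptions chosen for the example.

```python
# Sketch of a simple liveness/freshness check against a Hive API endpoint:
# fetch the dynamic global properties and measure how far behind the head
# block time is. Endpoint and threshold are illustrative assumptions.
from datetime import datetime, timedelta, timezone
import requests

API_URL = "https://api.hive.blog"  # replace with your own node's endpoint

def head_block_lag(api_url: str = API_URL) -> timedelta:
    """Return how far the node's head block time lags behind the current time."""
    payload = {
        "jsonrpc": "2.0",
        "method": "condenser_api.get_dynamic_global_properties",
        "params": [],
        "id": 1,
    }
    props = requests.post(api_url, json=payload, timeout=5).json()["result"]
    head_time = datetime.fromisoformat(props["time"]).replace(tzinfo=timezone.utc)
    return datetime.now(timezone.utc) - head_time

if __name__ == "__main__":
    lag = head_block_lag()
    # Arbitrary threshold for this sketch: more than ~30s behind means unhealthy.
    print("healthy" if lag < timedelta(seconds=30) else f"unhealthy, lag={lag}")
```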

We’ve also been testing routine maintenance procedures, such as shutting down and bringing back up various subsystems on the API node; we’ve found and fixed several bugs during this process, as well as made various performance improvements.

As a best practice, we’re strongly recommending the use of ZFS for deployment of future API nodes: in fact, the docker compose scripts assume the API node will be configured and maintained on a ZFS filesystem and we expect considerable extra setup and maintenance effort will be required for anyone who wants to avoid ZFS.

One of the key drivers for selecting ZFS is that we plan to regularly provide ZFS snapshots containing hived/HAF synced to the latest Hive blockchain headblock (originally we were using pg_dump/pg_restore to provide filled HAF databases, but we found that ZFS snapshots were a much better solution).

ZFS compression also dramatically lowers the disk storage requirements needed for a full API node. This approach also allows us to provide API node operators with a suggested optimal partitioning of the storage of the node in terms of space and performance needs.
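To make the ZFS workflow concrete, here is a rough sketch of the dataset and snapshot steps involved, driven from Python via subprocess; each call mirrors the `zfs` command an operator would run by hand. The pool/dataset names and the lz4 compression choice are purely illustrative, not the layout our scripts will actually use.

```python
# Illustrative ZFS workflow: compressed datasets for HAF storage plus a
# snapshot of a fully-synced database. Names are placeholders, not our layout.
import subprocess

def zfs(*args: str) -> None:
    """Run a `zfs` subcommand and raise if it fails."""
    subprocess.run(["zfs", *args], check=True)

# Compressed datasets for the HAF database storage and the hived block log.
zfs("create", "-o", "compression=lz4", "tank/haf_db")
zfs("create", "-o", "compression=lz4", "tank/blockchain")

# Snapshot a fully-synced HAF dataset; a snapshot like this is what could be
# shared with other node operators in place of a pg_dump of the database.
zfs("snapshot", "tank/haf_db@synced-latest")
```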

This default docker compose script to set up an API node will launch dockers for:

  • hived to connect to the hive network (currently this docker also contains the HAF database server, but in the future these will be deployed as individual dockers)
  • HAF to act as a database for Hive blockchain data received from hived
  • haproxy for load balancing of API calls
  • jussi for routing, caching, and legacy processing of json-based API calls
  • varnish for caching of new REST-based API calls
  • various HAF apps: HAFAH API (for Hive account history info), the block explorer API and UI for looking up general blockchain data, and the Hivemind API (for Hive social media applications). A HAF app is usually deployed as several dockers. For example, for the block explorer, there is one docker that runs the app’s main event loop on the SQL server and another that runs PostgREST to serve REST calls made to the database.

The addition of varnish is due to another point I should highlight: for future APIs, we're mainly adding REST-based APIs, as we found this leads to superior performance from PostgREST API servers.
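To show the difference in call style (not the final routes), compare a legacy JSON-RPC account-history call with what an equivalent REST call might look like. The JSON-RPC request below is the existing API routed through jussi; the REST path is a made-up placeholder for illustration only, since the real routes will be published with the release.

```python
# Comparison of the legacy JSON-RPC style and the new REST style of API call.
import requests

NODE = "https://api.hive.blog"

# Legacy style: JSON-RPC POST, routed and cached by jussi.
payload = {
    "jsonrpc": "2.0",
    "method": "account_history_api.get_account_history",
    "params": {"account": "blocktrades", "start": -1, "limit": 10},
    "id": 1,
}
result = requests.post(NODE, json=payload, timeout=10).json()["result"]
print(len(result["history"]), "operations fetched via JSON-RPC")

# New style: a plain REST GET served by PostgREST and cached by varnish.
# Hypothetical path shown for illustration; the real routes ship with the release.
# requests.get(f"{NODE}/hafah/accounts/blocktrades/operations?limit=10", timeout=10)
```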

HELpy: Hive Execution Layer for Python, coming soon

We’re also developing a new general-purpose Python-based library for Hive called HELpy, but it is still in an early stage of development. Initial uses in our projects include integration into Clive and test-tools (in test-tools it will replace calls to the old cli_wallet). Once it has been successfully used there, it should be well-tested enough for general usage.

Ongoing and upcoming tasks

  • New release candidates of hived, HAF, and HAF apps for 1.27.5 (likely in two weeks).
  • Finish docker compose scripts to ease deployment of API node infrastructure (this will be part of the new release).
  • Finish up the initial version of the HAF-based block explorer backend and GUI (hopefully also part of the new release, but that is less certain).
  • Integrate the keyauth state provider provided by HAF into the HAF block explorer.
  • Continue work on the initial Denser release.
  • Continue work on Consensus State Providers (for more powerful HAF apps).
  • Add support for more operations to Clive wallet.
  • Collect benchmarks for a HAFAH app operating in “irreversible block mode” and compare them to a HAFAH app operating in “normal” mode (low priority).
  • Publish more documentation for various new tools (beekeeper, Clive, consensus state providers) and separate HAF documentation into smaller, more easily digestible chunks.
  • Benchmark some of the recent performance improvements in HAF and HAF-based apps.
  • Deploy an updated version of HAF to our publicly-accessible HAF server.

Hmm, looks like I'll not be able to use the new deployment stuff, unfortunately. ZFS on my current server will eat my SSDs' lifespan. I burnt through 6% on brand new enterprise SSDs in the space of a few weeks. =X

Granted, I never had an issue setting it up the old-fashioned way via the normal docs! :)
The new simple deployment will be great for app creators who want to be up and running without fuss.

Hmm, just FYI, we've been running ZFS on NVMe SSDs for a couple of years now without any issues. We pretty much exclusively work on SSDs for our databases.

Yeah, it may very well simply be that the current implementation of ZFS on Proxmox (my current server OS) isn't that great, as that's where I hear the most horror stories about people's SSDs getting shredded, especially if they are consumer-grade ones.
That, and given my lack of knowledge of ZFS in particular, I just opted for LVM RAID grouping, which seems to have solved my issue for the time being.

I see. We work pretty much exclusively on Ubuntu, and I've no experience at all with Proxmox.

I can say that in my own personal experience, I found administration with ZFS infinitely more friendly than LVM, to the point that I began pushing all our admins to use it even though they mostly had experience with LVM previously.

I agree, ZFS is so simple to use and get going. I'd honestly prefer to use it, especially since versions newer than the one available on Proxmox are said to be leaps and bounds better.

I honestly never really thought to simply use Ubuntu as my host system. I figured it'd always be easier to use something that has a nice interface for making VMs and managing the host, like Proxmox, Unraid, and various platforms like that.

Really nice update; the new Hive wallet will be more secure.

Exciting stuff!

Thanks for all you do for Hive. It gets better every day!

Your post is truly extraordinary

Though I probably understand less than half of the stuff being mentioned, I appreciate the updates! 😁

All the best to the ongoing / upcoming projects! 💪

thanks for all the work!

Amazing work. Thanks for the updates

Thanks for the update

Congratulations @blocktrades! You received a personal badge!

HiveFest 8 Attendee

You can view your badges on your board and compare yourself to others in the Ranking

Congratulations @blocktrades! You received a personal badge!

Thank you for your participation in the HiveFest⁸ Meetings Contest.

We truly hope you enjoyed HiveFest⁸ and it's been our pleasure to welcome you in Rosarito.

See you next year!

You can view your badges on your board and compare yourself to others in the Ranking

If you folks make something and call it HAFAHTANK, I'd be impressed for the rest of my life.

Congratulations @blocktrades! Your post has been a top performer on the Hive blockchain and you have been rewarded with this rare badge

Post with the highest payout of the week.

You can view your badges on your board and compare yourself to others in the Ranking.
If you no longer want to receive notifications, reply to this comment with the word STOP.

Check out our last posts: HiveFest Meetings Contest

Hey, I just contacted Chainge Finance and asked about a possible integration with them. This would be a big step, imo, for Hive entering the DeFi space. But one requirement seems to be EVM compatibility. Do you think Hive can get this with the smart contract upgrade soon?

The work we're planning will not enable EVM compatibility. There have been some other people who have talked about such possibilities, but I suspect it would be quite a lot of work, so I am not really sure it is work worth doing.

I see, I guess the question is how we can open up Hive to the defi space in general. It would be great to see some integration in that regard.

The 2nd layer is definitely the best place for that in my opinion. The work we're doing there should make it much easier to integrate defi apps at the 2nd layer.


I've been writing at Hive for seven years, and I've always thought of this community as a model of justice. That's why I'm sad about this situation. @blocktrades, I congratulate you, and I need your help. Recently my post has been devalued and I don't know who to turn to. I wrote and asked @curangel to explain why this happened, but I didn't get an answer. Do you know how I can solve this issue?

My friend, thanks for the explanation, and for everything.
I love your work.

Congratulations @blocktrades! Your post has been a top performer on the Hive blockchain and you have been rewarded with this rare badge

Post with the highest payout of the week.

You can view your badges on your board and compare yourself to others in the Ranking.
If you no longer want to receive notifications, reply to this comment with the word STOP.

Check out our last posts: Rebuilding HiveBuzz: The Challenges Towards Recovery