5th update of 2022 on BlockTrades work on Hive software

in #hive • 3 years ago (edited)


Below are highlights of some of the Hive-related programming issues worked on by the BlockTrades team since my last post.

Hived (blockchain node software) work

Optimization of Resource Credit (RC) calculations

Changes were made and tested to add extra RC cost to the custom operations used for RC delegations (while these operations aren’t strictly consensus, they do impose more costs on a hived node than other custom operations).
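
As a rough sketch of the idea (the function, fields, and surcharge below are illustrative only, not the actual hived RC code), the per-operation cost calculation can simply charge more when a custom operation’s id marks it as an RC delegation operation:

```cpp
// Illustrative sketch only: shows the idea of surcharging RC-delegation custom ops.
#include <cstdint>
#include <string>

// stand-in for hived's per-operation resource cost record
struct rc_cost
{
  int64_t execution_time = 0;   // cost attributed to CPU time
  int64_t state_bytes    = 0;   // cost attributed to state growth
};

rc_cost custom_op_cost( const std::string& custom_op_id, int64_t payload_size )
{
  rc_cost cost;
  cost.execution_time = payload_size;        // baseline cost proportional to payload size
  if( custom_op_id == "rc" )                 // assumption: RC delegations use the "rc" custom op id
    cost.execution_time += 2 * payload_size; // illustrative surcharge, not the real multiplier
  return cost;
}
```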

Testing and fixing bugs in new wallet_api code

Found and fixed some more bugs in wallet_api code. I think this task will be closed out soon.

Testing the new p2p code with testtools-based testnets

We fixed a minor race condition in the new sync code that got exposed during testing: https://gitlab.syncad.com/hive/hive/-/commit/c62803aa6a1627771ee0f05f95710154263a843c

Testing of the new p2p code also exposed a latent bug in the fork_database code: the fork database’s copy of the head block didn’t get updated properly when a fork switch occurred. This could cause problems for the new p2p code during syncing, because it now uses this copy of the head block instead of the one in chainbase to reduce mutex contention on chainbase. Now the head block is properly updated during fork switches: https://gitlab.syncad.com/hive/hive/-/commit/5c0b7fc0e290859d6d1809234a2c87cedecc760c
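
As a rough illustration of the idea (the class and member names below are hypothetical, not hived’s actual fork_database code), the fork database keeps its own copy of the head block behind a lightweight lock so the p2p code can read it without touching the chainbase mutex, and the fix amounts to refreshing that copy whenever a fork switch changes the head:

```cpp
#include <memory>
#include <mutex>

struct block { /* block id, block number, previous block id, ... */ };

class fork_database_sketch
{
public:
  // Called when a new head block is applied AND at the end of a fork switch;
  // the fork-switch call is the step that was previously being skipped.
  void set_head( std::shared_ptr<block> new_head )
  {
    std::lock_guard<std::mutex> guard( _head_mutex );
    _head = std::move( new_head );
  }

  // What the new p2p/sync code reads instead of chainbase's copy of the head block,
  // avoiding contention on the much busier chainbase mutex.
  std::shared_ptr<block> get_head() const
  {
    std::lock_guard<std::mutex> guard( _head_mutex );
    return _head;
  }

private:
  mutable std::mutex _head_mutex;
  std::shared_ptr<block> _head;
};
```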

During this testing, we also found a longstanding problem with the way the mutex locks that protect access to critical resources were being handled. Many of the read locks take a timeout parameter (i.e. if a read lock isn’t obtained within 1s, the lock attempt fails and the calling code has to handle that failure). It turns out that these timed lock attempts can actually fail for no reason (even before the timeout expires), so code using them must explicitly check the result of each lock attempt, and those checks weren’t being performed. This didn’t happen very often (maybe once in a million times), but over long periods of time this has no doubt resulted in occasional unexpected failures inside the code.

To fix this problem, read locks are now “untimed” by default (they block until the lock is acquired), and only the API server uses locks with timeouts (API calls are allowed to fail, and the timeouts prevent API calls from taking up too much of chainbase’s access time and potentially starving the critical execution of the write_queue that writes blockchain data to chainbase). We also replaced the boost::interprocess locks with standard locks, since the locks are only used within a single hived process and there didn’t seem to be a need for the presumably more expensive interprocess locks. This work was merged in here: https://gitlab.syncad.com/hive/hive/-/merge_requests/401
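
Here’s a minimal C++ sketch of the distinction, using standard library locks rather than hived’s actual classes (the mutex name is illustrative): a timed read-lock attempt can fail, even spuriously, so the caller must check whether the lock was obtained, while an untimed lock simply blocks until it is acquired:

```cpp
#include <shared_mutex>
#include <chrono>
#include <stdexcept>

std::shared_timed_mutex chainbase_mutex; // stands in for the lock protecting chainbase

// API-style reader: uses a timeout so a contended lock can't stall API threads,
// but the caller MUST check whether the lock was actually acquired.
void read_with_timeout()
{
  std::shared_lock<std::shared_timed_mutex> lock( chainbase_mutex, std::chrono::seconds(1) );
  if( !lock.owns_lock() )  // timed attempts may fail, even spuriously before the timeout
    throw std::runtime_error( "could not acquire read lock within 1s" );
  // ... perform read-only work on shared state ...
}

// Default reader after the fix: an untimed lock that blocks until acquired,
// so there is no failure case to forget to check.
void read_untimed()
{
  std::shared_lock<std::shared_timed_mutex> lock( chainbase_mutex );
  // ... perform read-only work on shared state ...
}
```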

Mirrornet (testnet that mirrors traffic from mainnet) to test p2p code

While trying to set up the mirrornet to test the new p2p code, we found some further problems in the mirrornet code, and these are currently being fixed.

Once they are fixed, we’ll resume attempts to test the p2p code under heavy loading conditions, but in the meantime we decided to rely on the “tried-and-true” method of throwing the new code into our production environment (watching it carefully, of course) to test the above-mentioned locking changes.

We also exercised the new locking code using the new API benchmark/stress tests that we use to test account history nodes, while the node under test was providing sync data to another node. Neither test exposed any bugs or performance regressions.

Completed initial design of block finality protocol

We completed our design for the new code to improve block finality time and we’ve begun implementation of the new code (sometimes with distractions to work on other tasks, unfortunately). I’ll write some more on this topic in a separate post once we’ve proved out the design more.

Hive Application Framework (HAF)

Filtering of operations using regexes to allow for “small” HAF servers

I believe the code for using regexes to filter operations is complete or nearly so, and tests are now being developed for it, but I forgot to get an update today on the status of this work, so I’ll update on this point tomorrow.

Benchmarking of alternative file systems for HAF-based servers

We’ve continued to benchmark HAF running in various hardware and software configurations to figure out optimal configurations for HAF servers in terms of performance and cost effectiveness. Among other things, we discovered that the location of PostgreSQL write-ahead logs (by default written to /var/lib/postgresql) can have a significant impact on the time it takes to reindex a HAF database (note that this is separately specified from the location of the HAF database itself).

HAF account history app (aka hafah)

We implemented and tested the changes I mentioned last week to create a new index in the HAF database to speed up get_account_history calls (and probably other similar calls that may be needed by other HAF apps in the future).

We’re now looking to see if we can speed up the next biggest bottleneck API call (get_ops_in_block), but performance of this call is already acceptable even if we can’t speed it up further.

In order to speed up our optimization work, we also made a higher-level script to eliminate some of the manual steps that were previously required to perform a benchmark of HAfAH and get useful analytical data from the benchmark (this will probably be committed tomorrow). This script may serve as a useful starting point for other HAF apps looking to benchmark the performance of the app's API calls.

Hivemind (social media middleware server used by web sites)

We continued to work on conversion of Hivemind to a HAF-based app with a slight detour: currently the code is being reviewed for possible improvements in coding style to meet best practices for Python.

We also updated the ujson package used by Hivemind because of security concerns about an older version of the package.

And finally we merged in an old optimization we made to processing of custom_json operations.

HAF-based block explorer

We're in the early stages of developing a HAF-based block explorer (open-source, of course). Two of our newer developers are getting introduced to the associated concepts and also reviewing HAF documentation as part of this work.

Condenser (source code for hive.blog and several other Hive-based web sites)

We’ve also been reviewing and merging in updates from @quochuy for condenser and we have a few more to merge in during the coming days (some fixes for Hive Authenticate and improvements to deter phishing attempts).

What’s next?

  • Modify the one-step script for installing HAF to optionally download a trusted block_log and block_log.index file (or maybe just allow an option for fast-syncing using a checkpoint to reduce block processing time, now that the peer syncing process is faster and may actually perform better than downloading a block_log and replaying it). This task is on hold until we have someone free to work on it.
  • Test filtering of operations by the sql_serializer using regexes and account names to allow for smaller HAF server databases.
  • Collect benchmarks for HAfAH operating in “irreversible block mode” and compare them to HAfAH operating in “normal” mode. This task is on hold until we’ve finished basic optimizations of the HAfAH API.
  • Further testing of hafah on production servers (api.hive.blog).
  • Finish conversion of hivemind to a HAF-based app.
  • More testing of new P2P code under forking conditions and various live mode scenarios and in a mirrornet testnet using only hived servers with the new P2P code.
  • Complete work on improving block finality time.
  • Complete work on resource credit rationalization.
  • Continue benchmarking of HAF and Hafah on ZFS and EXT4 file systems with various hardware and software configurations.

I’m pushing the current expected date for the next hardfork to May, given the large number of testing and performance benchmarking tasks still facing us and a couple of key functional tasks still to be completed (RC rationalization and block finality improvement tasks).


Without going into as much detail as a full post, what's the block finality protocol?

A communication protocol for speeding up the rate at which blocks (and more importantly, the transactions inside them) can be considered final/irreversible (not subject to reversal in a fork). This is the "secret" feature I've mentioned in the past :-)

Aaaah I see :D Looking forward to it !

smart contracts?

Unrelated

ok. No idea what block finality protocol is :D

Recent blocks in a chain can be reversed, so you can see them as an indication of what transactions will occur, but you can't rely on them (for example, to perform a trade). Currently in Hive it takes about a minute for blocks to become final, at which point the transactions can be relied upon. @blocktrades is apparently working on a way to speed that up.
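
As a rough back-of-the-envelope for the “about a minute” figure, assuming Hive’s usual parameters (3-second blocks, 21 scheduled witnesses, and irreversibility once roughly 75% of them have built on a block), none of which are spelled out in this thread:

```latex
\[
  t_{\text{final}} \approx \lceil 0.75 \times 21 \rceil \ \text{blocks} \times 3\ \text{s/block}
                  = 16 \times 3\ \text{s} = 48\ \text{s} \approx 1\ \text{minute}
\]
```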

Very interesting.

Was there any problem in the past with that?

Not a problem. Would be an improvement.

HAF-based block explorer will be game-changing and herald a new revolution.

I do think it will enable very versatile and even customizable block explorers. One interesting aspect of this is I hope to build it in a way that the block explorer can easily be taught to understand the meaning of 2nd layer transactions specific to 2nd layer apps (games, defi apps, etc) and potentially report state associated with those apps.

The very basic thing to interact with web 3.0 or similar applications most of the time is an explorer. So you have rightly conceptualized it to make it decentralized and to offer a menu choice for people who love decentralization in their day-to-day endeavors. I appreciate your wisdom and effort in this regard. More power to you.

stuff becomes ultra efficient.

If I understand it right, it would make a solution (in terms of speed and indexing) for light wallets easier.

I mean, for example, using EVM chain wallets to transact on Hive via a mother wallet (custom transaction) on a multisig wallet.

like wallet xyz buys splinterlands card. Transaction on hive + ownership verification.

I mean the only puzzle piece that would additionally be needed is a way to verify the wallet (like importing the asymmetric cryptographic data / public keys to allow a transaction and verify it).

For basic custom transactions, that would be enough. And it would not need to be on the first layer on Hive; it could operate on Layer 2 and post into L1 via a mother wallet or a public wallet.

With the customizable block explorer, it could be very efficient.

Screenshot_285.png

Howdy.
The image is just to highlight the comment a bit more. I'm using the reply at the top to save you scrolling down an encyclopedia of content.

Thank you for the delegation support over the last year. I did not do as much with it as I should have. The reasons why don't matter. Life will always put hills in your journey.

It has provided insight and some minor confirmations of my theories for Hive and the tokens it provides. I will be in a much better position to begin a pilot revenue generator when HIVE value increases again.

Thank you.
I'd vote for your witness as a thank you. But that was already there. If there is anything I can do in assistance for you, I would be available.


ChatGPT explains this article as to a child:

"This article is talking about a team of people who are working on improving a computer program called "Hive" which runs a blockchain network. They are working on different parts of the program to make it work better and faster.

Some of the things they worked on recently include fixing bugs (which are like mistakes in the program) and making sure different parts of the program work well together. They also made changes to make the program run faster, and they are testing those changes to make sure they work.

They are also working on a new part of the program that will help it be more secure and efficient, and they are testing different ways to make the program run better.

Overall, the team is working hard to make the Hive program work as well as it can, and they are always looking for ways to make it even better."

Thank you @blocktrades. Credit to you for all your invaluable work for the Hive blockchain! :)


Damn, y'all been busy... I managed to get a tree planted 🌳

The wind storms around here keep trying to knock down all the trees around our house. The oaks seem stable, but the pines are a bit worrisome.

Just wanted to say real quick, after 3 or so years of not using Blocktrades, I can confirm that it's still super awesome. Now you have to register to trade but it's simple, I think I only had to provide an email. Fast and easy-to-use exchange service. Keep it up!

We'll continue to keep it as simple as we can, but financial regulators keep imposing new rules. From what I've heard recently, I think we'll be required to collect identity information for all customers trading after June (currently we only have to do it for customers trading large daily amounts), but I need to double check the Cayman Islands regulations to ensure I didn't misunderstand.

Any news about identity information for all customers after June?

Thank you for the regulatory update!

I know you guys have tried to keep our personal business our personal business, but you also have a business of your own to run. I am confident you won't voluntarily give up any more information than absolutely necessary and proactively disclose any privacy changes.

Love Blocktrades, glad to know it's only getting better 👍

How are you training the two new devs? I wanted to help Hive at a deeper level, but I don't even know how to start.

There's my post here: https://hive.blog/hive-139531/@blocktrades/a-developer-s-introduction-to-the-hive-application-framework

I've also had them do some work on two existing HAF apps in the Hive repo (balance_tracker and HAfAH). And the UI guy has been doing some reading and experiments with the existing Hive API (documented here: https://developers.hive.io/).

What kind of development are you looking to do? Blockchain dev or application development? If it's perhaps more UI-centric, let @peakd know.

I feel like blockchain/protocol work is going to need new devs because it is important and few people can do it, but I am willing to help on the application level.

I am working as a front-end dev, so I could definitely try to help @peakd. How can I get started, though?

Probably contact us on discord and we can send the correct application links.
https://discord.gg/zV9eTpNC5M

peakd.com has been trying to hire a full-time person, but we're certainly warming up to the idea of contracting developers for some side projects.
For example we have been working with @imwatsi on HAF stuff that we can in turn integrate into PeakD.com and assume they'll also get integrated into many other interfaces.

We have several other side projects that could likely be done without having to worry about the peakd.com code in general, which makes it easier for talented people who are just getting started.

I've been asking around for somebody to create a UI that allows for account creation, swaps of whatever for hive, hbd, and hive power, and deposits into savings accounts.
Blogging can be mentioned as well as keys introduced if they care to click the links.
But, the primary functionality would be to get an account and deposit into savings.

Provided they get the owner key, they can learn more later, if they care to do so.
You already have the swap exchange, is it possible for you to spin up an interface and host it?

Will we get 'bonds' as described by taskmaster with the next hardfork?

Will we get 'bonds' as described by taskmaster with the next hardfork?

I like the idea, so I think it's a good possibility, unless someone else beats us to it at the 2nd layer in the meantime.

Any second layer solutions for bonds would have central points of failure?

This ui for creating accounts and allowing deposits to the wallet seems trivial to me, yet nobody has done it.
Am I missing something?

Any second layer solutions for bonds would have central points of failure?

For a 2nd layer solution, you could set up an N-of-M multisig that works in a similar way to decentralization via witnesses at the 1st layer.

This ui for creating accounts and allowing deposits to the wallet seems trivial to me, yet nobody has done it. Am I missing something?

I'm not sure of exactly what you're asking for. Do you just want an account creation feature inside an open-source UI, as opposed to the current closed-source ones?

I would like a URL that I can send people to that allows for account creation and deposits/withdrawals into/from savings.
Preferably from a number of coins to hive and back to a number of coins on the way out.

When we get bonds I'd presume that some people will want to have an interface that can deposit to the savings/bond features without having to worry about the blogging aspects of the chain.

Multisig would be important to larger/corporate users, I suppose.

A ui that allows for these features will facilitate easier entry into the ecosystem.
The new user gets their owner key and can skip the added complexity of all the keys and surviving the crab bucket of the pool.
Though information on the use of the active key should probably be included for security reasons.

I'm surprised this hasn't already been offered.
It seems to take the place of light accounts without much additional complexity.

For access to expanded features of the chain, the new user would have to use an alternative interface, such as ecency, peakd, or hive.blog.
The proposed interface would simply deposit/withdraw from savings once a user has created/entered an owner key.

I think the next HF will be in June, like one year since the last one. I like to play meme-stradamus sometimes :P

Sounds cool and interesting! Just signed up ;-)
Good luck (toi toi toi) and greetings from Vienna

I got lost right from the beginning.😥😥

The reason I tend to skip programming and Tech related posts.🙁

Who will explain it in plain language, please?
I'm interested in knowing the latest change in the Hive blockchain.😊

Hi, I'm looking forward to RC delegations, but I actually have a brief off-topic question you can probably answer....

HBD inflation has increased from 3% to 10% recently (maybe going up further) but where does this extra HBD come from?

Is this just totally NEW HBD which is in addition to the 'Hive rewards pool' which goes to witnesses/ dao/ content creators and curators?

(In which case I would have to increase the overall inflation rate for HBD slightly if I was interested in working that out?)

It is additional inflation. Currently this is very minimal (about 0.1%) because only HBD held in savings gets interest and that is only around 3 million. If that were to increase too much relative to overall Hive market cap, witnesses would have to reconsider whether the level of interest being paid was appropriate.
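
As a quick sanity check using only the figures mentioned in this thread (roughly 3 million HBD in savings earning a 10-20% APR), the amount of new HBD created per year is modest:

```latex
\[
  3{,}000{,}000\ \text{HBD} \times 10\% = 300{,}000\ \text{HBD/yr}, \qquad
  3{,}000{,}000\ \text{HBD} \times 20\% = 600{,}000\ \text{HBD/yr}
\]
```

A few hundred thousand HBD per year is small relative to the overall market cap, which is where the “about 0.1%” estimate above comes from.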

Cheers, as you say, ATM we are talking about such small amounts that I'm in favour of the 20% for now.

Congratulations @blocktrades! You have completed the following achievement on the Hive blockchain and have been rewarded with new badge(s):

You received more than 1620000 HP as payout for your posts, comments and curation. Your next payout target is 1625000 HP. (The unit is Hive Power equivalent because post and comment rewards can be split into HP and HBD.)

You can view your badges on your board (https://hivebuzz.me/@blocktrades) and compare yourself to others in the ranking (https://hivebuzz.me/ranking). If you no longer want to receive notifications, reply to this comment with the word STOP.

Check out the latest posts from @hivebuzz: Hive Power Up Month - Feedback from April day 6, NFT for Peace - Feedback and new city, and Our Hive Power Delegations to the March PUM Winners. Support the HiveBuzz project by voting for our proposal (https://peakd.com/me/proposals/199).

Hello friends of @blocktrades

I would like to ask if there was any error in the vote on my publication, because the vote was made with 0.3% of voting power. I look forward to your prompt response, thank you very much.

https://peakd.com/hive-145796/@reinoldroberts/ccblacsu

Thanks for the updates :D

The new changes made in Hive excite us, the writers. I think that with the development of technology, new changes and developments will occur.

Can you add an operation to automatically claim HBD interest if one hasn't claimed it in a long time?
For example in six months or something.

Now that the APR is increasing, it can accumulate more.


https://hive.blog/hbd/@dalz/hbd-interest-or-claimed-vs-earned

Thank you very much.


I understand very little about the blockchain. What I know is that Hive has changed my life, and I hope to continue sharing my violin in my content.

Good job for improving the platform.

Hey there BT, I was wondering, is there something the Devs can do to stop or prevent trolls from creating lists of 2,000 accounts and making 10+ notifications per day in your feed?

You can make a mute list and mute the accounts, this will prevent them from showing up in your notifications (or anywhere).

Hey there @blocktrades, I am wondering how I can report a fraudulent account. We created an account back in 2016 for our Thirsty Entertainment project. It was registered and paid for in my name from accounts that I created: thirsty-entertainment, thomas-k and thomas-kohi. The person who changed the passwords was Raymond Johnstone, a volunteer on the project who turned sour after he realized there would be no immediate payments until the project got funding, which it never did. How can I go about recovering those accounts, or at the very least closing them so he cannot use them to siphon funds to his other projects? Thanks in advance. I have blockchain transaction screenshots for proof.

If the keys were changed in less than 30 days, you may be able to do an account recovery. Check into Hive discord, someone there can probably assist you in real-time.

Question: I accidentally went over the limit. Do I get my money back if it failed?

I wanted to transfer 432 HBD to Litecoin, but it failed. What was the reason for this failure? The Litecoin address was OK; I already used it successfully last year. Can I get my HBD amount back, or will you convert it?

Thank you for the info.

What is blocktrades all about?

Great job and amazing

Nice post indeed, looking forward to it.

Hey there! I am trying to use blocktrades.us but, even though my identity has been approved, I cannot trade or deposit Hive tokens. Can you help me?

We have seen that you have previously supported proposals for the improvement of Hive.

We would love to have your support in our proposal that seeks to add a new tool to the Hive environment that will help ensure that your content continues to prioritize quality and above all originality.

You can vote here:
Peakd
Ecency

We would appreciate your support, criticism, and collaboration. Thank you for considering this proposal.

I'm impressed with the Up-Votes you receive... Way to go...

Wow, excellent post, very well explained. I'm leaving you my vote. Greetings.