Rate Limit Positive Rebalancing Proposal

Title: Rate Limit Buy (Positive) Rebalancing Over Time

Proposed Rate Limit: 20% of bnUSD supply to be minted through rebalancing per day.

As discussed here:

Reasoning: It would soften the upper level of the rebalancing threshold and provide borrowers with some certainty.

How it works:
Based on 10,000,000 bnUSD supply:

Format:

First line (before rebalancing):
(value of collateral in bnUSD), (value of loan in bnUSD), (LTV)
Second line (after the rate limit is reached, i.e. 20% more bnUSD minted):
(value of collateral in bnUSD), (value of loan in bnUSD), (LTV)

Minting the full 20% (2M bnUSD) through rebalancing would affect these positions this way (the arithmetic behind the first row is worked through below the examples):

1000, 300, 30%
1060, 360, 33.96%

1000, 350, 35%
1070, 420, 39.25%

1000, 450, 45%
1090, 540, 49.54%
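
To make the derivation explicit, here is the arithmetic behind the first row, assuming each position's debt grows by the full 20% and the minted bnUSD is added to the collateral at face value:

new loan = 300 + (300 × 0.20) = 360 bnUSD
new collateral value = 1,000 + 60 = 1,060 bnUSD
new LTV = 360 / 1,060 ≈ 33.96%

The other two rows follow the same pattern.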

Please discuss and make suggestions. Let's get this right.

This topic is a little confusing, so I'll work through an example.

Parameters:

  • sICX value at start $1
  • Total bnUSD at start 10,000,000
  • Price of sICX drops to $0.90
  • 5% Positive Rebalancing threshold
  • Rate Limit of 20% Positive Rebalancing is hit
  • Total bnUSD when rate limit hits 12,000,000

Leslie, Ron, Tom, and Andy all have 10,000 sICX collateral at the start. They all have a different amount of bnUSD loan.

Leslie:
10,000 sICX collateral ($10,000 value)
1,000 bnUSD loan
10% LTV

Ron:
10,000 sICX collateral ($10,000 value)
2,500 bnUSD loan
25% LTV

Tom:
10,000 sICX collateral ($10,000 value)
3,500 bnUSD loan
35% LTV

Andy:
10,000 sICX collateral ($10,000 value)
4,500 bnUSD loan
45% LTV

  • Price of sICX drops to $0.90
  • bnUSD is valued at $1.06
  • Positive rebalancing happens until the 20% rate limit is reached; no more positive rebalancing occurs until the next day
  • 2,000,000 bnUSD is minted by borrowers and used to buy sICX that is added to their collateral.
  • sICX price for rebalancing is 0.86 bnUSD (as it is bought with overvalued bnUSD, so the price to borrowers is lower than the market price of $0.90)

Leslie:
10,232.55 sICX collateral ($9,209.30 value)
1,200 bnUSD loan
13.03% LTV

Ron:
10,290.70 sICX collateral ($9,261.63 value)
2,750 bnUSD loan
28.69% LTV

Tom:
10,406.98 sICX collateral ($9,366.28 value)
3,850 bnUSD loan
41.10% LTV

Andy:
10,523.26 sICX collateral ($9,470.93 value)
4,950 bnUSD loan
52.27% LTV
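
To see where the day-1 figures above come from, here is a minimal sketch (plain Java, purely illustrative, not contract code) of the per-borrower arithmetic: the borrower's share of the minted bnUSD buys sICX at the 0.86 rebalancing price, and the enlarged collateral is then valued at the $0.90 market price. With Leslie's numbers it reproduces her row:

public class RebalanceExample {
    public static void main(String[] args) {
        double rebalancePrice = 0.86;  // bnUSD paid per sICX during rebalancing
        double marketPrice    = 0.90;  // USD market price of sICX after the drop

        // Leslie: 10,000 sICX collateral, 1,000 bnUSD loan, 200 bnUSD minted for her position
        double collateral  = 10_000;
        double loan        = 1_000;
        double mintedShare = 200;

        double newCollateral = collateral + mintedShare / rebalancePrice; // 10,232.56 sICX
        double newLoan       = loan + mintedShare;                        // 1,200 bnUSD
        double newValue      = newCollateral * marketPrice;               // $9,209.30
        double newLtv        = newLoan / newValue;                        // ~13.03%

        System.out.printf("%.2f sICX ($%.2f), %.0f bnUSD loan, %.2f%% LTV%n",
                newCollateral, newValue, newLoan, newLtv * 100);
    }
}

Plugging in the other borrowers' minted amounts (250, 350, and 450 bnUSD) gives their rows the same way; day 2 repeats the calculation with the $0.80 market price.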

Edit: Now imagine this happens in the last hour of day 1, and that positive rebalancing continues during the following hour, the first hour of day 2.

  • Price of sICX drops to $0.80

  • bnUSD is valued at $1.06

  • Positive rebalancing happens until the 20% rate limit is reached; no more positive rebalancing occurs until the next day

  • 2,400,000 bnUSD is minted by borrowers and used to buy sICX that is added to their collateral.

  • sICX price for rebalancing is 0.86 bnUSD (for day 2 I kept the rebalancing price at 0.86 even as sICX dropped to $0.80, to create a harsher effect)

Leslie:
10,511.62 sICX collateral ($8,409.30 value)
1,440 bnUSD loan
17.12% LTV

Ron:
10,930.23 sICX collateral ($8,744.19 value)
3,300 bnUSD loan
37.74% LTV

Tom:
11,069.18 sICX collateral ($8,855.34 value)
4,620 bnUSD loan
52.17% LTV

Andy:
11,374.66 sICX collateral ($9,099.73 value)
5,940 bnUSD loan
65.28% LTV

In this "perfect storm" scenario, with rate-limited positive rebalancing happening within a short window (the end of day 1 and the start of day 2), the change in LTV for the four borrowers looks like this:

Leslie:
10% to 17.12% LTV

Ron:
25% to 37.74% LTV

Tom:
35% to 52.17% LTV

Andy:
45% to 65.28% LTV

Discussion Points:

  • Is 20% the right amount for rate limiting?
  • Higher, lower?
  • Going too low might hurt the peg (and value as a stable product) of bnUSD.
  • Going too high might not provide any support for borrowers.
  • With larger price drops, the change in LTV is exacerbated.

I think in general, rate limiting or otherwise controlling the buy side of rebalancing can be very flexible and freeform the moment we aren't welded to the notion of the upper peg.

I still believe the important thing is a time delay instead of a maximum amount, and this applies to both directions of the peg. If someone trades an astronomical sum of sICX in either direction, it shouldn't cause rebalancing immediately; it should taper its way in.

And the way it would be done is similar to a maximum amount, but the amount "rebalanced so far" reduces as time goes by, either at a fixed rate, at a rate that increases the wider the peg, or by any other exotic formula you want to apply. So there would still be a cap, but instead of that cap emptying every day, it would empty as time passed.
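
For illustration only, here is a rough sketch of such a decaying tally; the class, parameter names, and the peg-scaling formula are assumptions for the sake of the example, not part of any spec:

public final class TallyDecay {
    // tally          bnUSD counted as rebalanced so far
    // blocksElapsed  blocks since the tally was last updated
    // ratePerBlock   base bnUSD drained from the tally each block (fixed-rate case)
    // pegDeviation   e.g. 0.06 when bnUSD trades at $1.06
    // sensitivity    how strongly the drain speeds up as the peg widens (0 = fixed rate)
    public static double decayedTally(double tally, long blocksElapsed,
                                      double ratePerBlock, double pegDeviation,
                                      double sensitivity) {
        double effectiveRate = ratePerBlock * (1.0 + sensitivity * pegDeviation);
        return Math.max(0.0, tally - effectiveRate * blocksElapsed);
    }
}

With sensitivity set to 0 this is a plain fixed-rate drain; a positive sensitivity makes the cap free up faster the further bnUSD is from $1.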

This makes a lot of sense, essentially some sort of trailing average. I'm thinking incremental upgrades, starting with an asymmetric peg the DAO can manage and a hard daily limit on rebalancing.

@benny_options @arch
I like where you are going with this, but I don't know the maths of how to implement it.

I made a spreadsheet earlier today, using 1.177 as the upper peg and having "buy" rebalancing happen at 0.5% of LTV an hour... but I don't know if it's very helpful.

After reading the posts, rate limiting of some sort should be able to soften the peg on the lower end too, but just with a tighter threshold.


We will have to change the bad debt retiring mechanism if we decide to go with a 17% upper peg limit.
Who will want to retire their bnUSD worth 1.17 USD, if they're only getting 1.10 USD worth of sICX for retiring it?


The implementation would be relatively simple: you have a formula based on blocks for how much it would decay by, and a static method that tells you how much cap would be available if you called it now.

When the method is called, find the blocks since the last successful rebalance, work out the decay over that time, compare with the incoming rebalance request to see if it's allowed or not, and then execute or reject.
As long as that value is readable directly, or trivially computed, anyone calling it can submit with the right amounts. The only additional information needed would be the block height of the last rebalance.


@arch any chance you'd be interested in taking a crack at the specific formula?

The Rebalancing contract is almost done being translated to Java so I think rate limiting and asymmetric peg could be voted on and implemented shortly after.


First, the design in pseudocode.
This is a rough design that looks right to me; hopefully someone can poke holes in it and find any gaps. I'll also put up a BigDecimal code snippet that I think is right, but without any blockchain-specific context. We should probably get an expert to weigh in on Math.pow; AFAIK there are decent approximations for arbitrary x^y power calculations that may help, as the default implementation of BigDecimal.pow doesn't support non-integer exponents.

Rebalance Limit (rebalanceLimit): limit in bnUSD of the allowance for rebalancing
Rebalance Tally (rebalanceTally): amount rebalanced so far
Rebalance Half Life (halfLifeBlocks): time in blocks for the rebalance tally to halve (exponential)
Rebalance Rate (linearRebalanceRate): decay in allowance per block (linear)
Last Rebalance Height (lastRebalanceHeight): block height of the last rebalance

The Rebalance Limit should be an amount settable by governance, either a direct fixed amount or a set % of debt.
If the Rebalance Tally is more than the Rebalance Limit, rebalancing is blocked.
Count the half-lives since the last rebalance.
Multiply the tally by the decay factor, which is 0.5 (half) raised to the power of the number of half-lives.
The choice of a half-life and multiplying by powers of 0.5 is arbitrary, but it makes the behaviour easier to visualise.
As long as the two parameters stay in sync they could be anything, e.g. the time to fall to 0.25 of the original and then using powers of 0.25, etc.

I've also added a linear function, if we prefer that.

rebalanceLimit = getRebalanceLimit()
rebalanceTally = getRebalanceTally()
// Early exit: stored tally already above the cap
if (rebalanceTally > rebalanceLimit) return

blocksSinceLastRebalance = currentBlockHeight - lastRebalanceHeight

// EXPONENTIAL DECAY
// halfLifeBlocks is measured in blocks
numberOfHalfLives = blocksSinceLastRebalance / halfLifeBlocks
decayMultiplier = 0.5 ^ numberOfHalfLives
newTally = rebalanceTally * decayMultiplier

// LINEAR DECAY (alternative)
// newTally = rebalanceTally - (blocksSinceLastRebalance * linearRebalanceRate)

// Execute rebalance
if (newTally < rebalanceLimit) {
    amount = rebalanceLimit - newTally    // headroom still available
    doRebalance(amount)
    rebalanceTally = newTally + amount    // record the amount just rebalanced
    lastRebalanceHeight = currentBlockHeight
} else {
    rejectRebalance()
}

EDIT2: Fixed formatting

import java.math.BigDecimal;
import java.math.RoundingMode;

    public Object rebalance(BigDecimal amount){
        BigDecimal rebalanceLimit = new BigDecimal(0); // get from governance
        BigDecimal rebalanceTally = new BigDecimal(0); // get from dict

        // Early exit: the undecayed tally plus this request already exceeds the limit
        if(rebalanceTally.add(amount).compareTo(rebalanceLimit) > 0){
            return new Object(); // return rejection data struct
        }

        BigDecimal currentBlockHeight = new BigDecimal(0); // get from block context
        BigDecimal lastRebalanceHeight = new BigDecimal(0); // get from dict
        BigDecimal blockDifference = currentBlockHeight.subtract(lastRebalanceHeight);

        BigDecimal halfLifeBlocks = new BigDecimal(0); // get from governance
        BigDecimal decayedRebalanceTally = getExponentialDecayTally(rebalanceTally, blockDifference, halfLifeBlocks);

        // ALTERNATIVE LINEAR METHOD
        // BigDecimal linearDecayRate = new BigDecimal(0);
        // BigDecimal decayedRebalanceTally = getLinearDecayTally(rebalanceTally, blockDifference, linearDecayRate);

        if(decayedRebalanceTally.add(amount).compareTo(rebalanceLimit) > 0){
            // Rebalance not allowed: decayed tally plus this request exceeds the limit
            return new Object(); // return rejection data struct
        }

        doRebalance(amount);
        // Probably done inside `doRebalance`; repeated here to emphasise the steps:
        // persist the decayed tally plus the amount just rebalanced, and the new height
        saveRebalanceTally(decayedRebalanceTally.add(amount));
        saveRebalanceHeight();
        return new Object(); // success data struct
    }

    public BigDecimal getExponentialDecayTally(BigDecimal initialTally, BigDecimal blocksDelta, BigDecimal halfLifeBlocks){
        // Keep fractional half-lives (scale 8) instead of rounding to a whole number
        BigDecimal numberOfHalfLives = blocksDelta.divide(halfLifeBlocks, 8, RoundingMode.HALF_UP);
        // BigDecimal has no pow() for non-integer exponents, so fall back to double here
        BigDecimal decay = BigDecimal.valueOf(Math.pow(0.5, numberOfHalfLives.doubleValue()));
        return initialTally.multiply(decay);
    }

    public BigDecimal getLinearDecayTally(BigDecimal initialTally, BigDecimal blocksDelta, BigDecimal linearDecayRate){
        BigDecimal decayAmount = linearDecayRate.multiply(blocksDelta);
        return initialTally.subtract(decayAmount);
    }

Edit: Forgot to account for the amount of rebalancing being attempted.
Edit2: Should have read the sample code more closely; it looks like we only have BigInteger? I'll let someone better ponder it, but I think these operations don't need absolute precision. I still need a way to do division though.
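
A possible way around both issues, sketched as an assumption rather than a recommendation: with only BigInteger available, the exponential decay can be approximated by halving the tally once per whole elapsed half-life, which needs nothing beyond integer division and bit shifts. The class and the example values below (including the 43,200-block half-life, roughly a day if blocks are ~2 seconds) are hypothetical:

import java.math.BigInteger;

public class DecayMath {

    // Approximate exponential decay: halve the tally once per whole half-life elapsed.
    public static BigInteger exponentialDecayTally(BigInteger tally,
                                                   BigInteger blocksDelta,
                                                   BigInteger halfLifeBlocks) {
        BigInteger halfLives = blocksDelta.divide(halfLifeBlocks); // integer division, rounds down
        if (halfLives.signum() <= 0) {
            return tally; // less than one full half-life has passed, no decay applied
        }
        // Cap the shift so a very long gap simply decays the tally to zero
        int shift = halfLives.min(BigInteger.valueOf(256)).intValueExact();
        return tally.shiftRight(shift); // tally / 2^halfLives
    }

    // Linear decay: subtract a fixed allowance per block, floored at zero.
    public static BigInteger linearDecayTally(BigInteger tally,
                                              BigInteger blocksDelta,
                                              BigInteger ratePerBlock) {
        BigInteger decayed = tally.subtract(blocksDelta.multiply(ratePerBlock));
        return decayed.max(BigInteger.ZERO);
    }

    public static void main(String[] args) {
        // Example: a 2,000,000 bnUSD tally decayed across 3 half-lives falls to 250,000
        BigInteger tally = BigInteger.valueOf(2_000_000);
        System.out.println(exponentialDecayTally(tally,
                BigInteger.valueOf(129_600), BigInteger.valueOf(43_200)));
    }
}

The trade-off is that the tally only drops in whole half-life steps; the linear version avoids that and needs only subtraction and multiplication.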
