You should multiply the numerator by a power of ten that accommodates the number of decimal digits you need, perform the BigInt division, and then divide the result with normal floating point division.
(Run in a browser that supports BigInt, like Chrome.)
var a = 12340000000000000000n;
var b = 1000000000000000000n;
console.log(Number(a * 100n / b) / 100);
By only converting to Number at the "end", you will lose the least precision.
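The scaling trick above can be wrapped in a small helper. This is a hypothetical `divideBigInts` function (not part of any library), assuming truncation toward zero is acceptable for the extra digits:

```javascript
// Divide two BigInts and get a Number with `digits` decimal digits of precision.
// Scale the numerator up by 10**digits first, so the integer division keeps
// those digits; convert to Number only at the very end, then scale back down.
function divideBigInts(a, b, digits = 2) {
  const scale = 10n ** BigInt(digits);
  return Number(a * scale / b) / Number(scale);
}

console.log(divideBigInts(12340000000000000000n, 1000000000000000000n, 2)); // 12.34
```

Note that the result is still a Number, so it is only meaningful when the quotient itself fits comfortably in double precision; the scaling just controls how many fractional digits survive the integer division.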
More precision
If you need more than 16 digits of precision and need decimals, then you'll need to roll your own implementation of a kind of BigDecimal API, or use an existing one.
Here is a simple one that uses BigInt as its base type, combined with a configuration that determines how many digits (from the right) of each such BigInt should be interpreted as decimals (digits in the fractional part). That information is used, for instance, to insert a decimal separator when outputting the number as a string.
class BigDecimal {
    constructor(value) {
        let [ints, decis] = String(value).split(".").concat("");
        // Pad (or truncate) the fractional part to exactly the configured length
        decis = decis.padEnd(BigDecimal.decimals, "0").slice(0, BigDecimal.decimals);
        this.bigint = BigInt(ints + decis);
    }
    static fromBigInt(bigint) {
        return Object.assign(Object.create(BigDecimal.prototype), { bigint });
    }
    divide(divisor) { // You would need to provide methods for other operations
        return BigDecimal.fromBigInt(this.bigint * BigInt("1" + "0".repeat(BigDecimal.decimals)) / divisor.bigint);
    }
    toString() { // Note: negative values would need extra handling here
        const s = this.bigint.toString().padStart(BigDecimal.decimals + 1, "0");
        return (s.slice(0, -BigDecimal.decimals) + "." + s.slice(-BigDecimal.decimals))
            .replace(/\.?0+$/, ""); // Trim trailing zeros and a bare trailing dot
    }
}
BigDecimal.decimals = 18; // Configuration of the number of decimals you want to have.
// Demo
var a = new BigDecimal("123456789123456789876");
var b = new BigDecimal( "10000000000000000000");
console.log(a.divide(b).toString());
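To add the other operations, it helps to see how the arithmetic works on the scaled integers themselves. A minimal sketch (standalone, using a hypothetical `SHIFT` constant rather than the class above): addition and subtraction work directly on the scaled values, while multiplication carries the scale factor twice and must divide it out once, just as division must multiply it in once:

```javascript
// All values are stored as integers scaled up by SHIFT = 10**DECIMALS.
const DECIMALS = 18;
const SHIFT = 10n ** BigInt(DECIMALS);

const x = 15n * SHIFT / 10n; // 1.5, scaled
const y = 25n * SHIFT / 10n; // 2.5, scaled

const sum = x + y;             // addition needs no rescaling
const product = x * y / SHIFT; // x*y is scaled by SHIFT twice; divide once

console.log(sum === 4n * SHIFT);               // true (represents 4)
console.log(product === 375n * SHIFT / 100n);  // true (represents 3.75)
```

The same rules would become `add` and `multiply` methods on the class, each returning `BigDecimal.fromBigInt(...)` of the expression above.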
Again, this needs a browser that supports BigInt (Chrome at the time of writing).