
JavaScript Developer Learning Ruby: Integer vs. Float Division

I am learning Ruby these days after working with JavaScript for a long time. I am finding many interesting differences between the two languages, and I hope to share my aha moments here in this series. I hope you find them useful and interesting.

JavaScript: Everything is a Floating-Point Number

In JavaScript, all numbers are floating-point (IEEE 754 double-precision), so division never truncates: dividing two whole numbers gives you a fractional result whenever there is one:

console.log(90 / 100); // 0.9
console.log(5 / 2); // 2.5

Even when working with whole numbers, JavaScript handles division as floating-point math.

Ruby: Integer Division Can Bite You

Ruby has separate types for integers (Integer) and floating-point numbers (Float). If you divide two integers, Ruby performs integer division and drops the fractional part (strictly speaking, it floors the result):

puts 90 / 100  # => 0  (Integer division!)
puts 90.0 / 100  # => 0.9 (Float division)
puts 5 / 2  # => 2 (Integer division)
puts 5.0 / 2  # => 2.5 (Float division)
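
This bites hardest when the integers come from variables, because nothing at the call site hints that truncation is happening. A small illustrative sketch (the scores and variable names are made up):

correct = 45
total = 60

puts correct / total  # => 0  (both operands are Integers, so the real ratio of 0.75 is truncated away)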

Fix: Force Floating-Point Division

To ensure a decimal result, make at least one number a float:

puts 90 / 100.0  # => 0.9
puts 5 / 2.0  # => 2.5
puts 5.fdiv(2)  # => 2.5 (another way)
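
When the operands live in variables rather than literals, you can convert one of them with to_f, or use fdiv or quo, instead of sprinkling .0 around. A quick sketch, reusing the hypothetical score example from above:

correct = 45
total = 60

puts correct.to_f / total  # => 0.75
puts correct.fdiv(total)   # => 0.75
puts correct.quo(total)    # => 3/4  (an exact Rational; call .to_f on it if you need a Float)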

Other Key Differences Between Numbers in JS and Ruby

  • NaN behaves differently. In JavaScript, invalid math like 0 / 0 quietly evaluates to NaN (Not a Number). In Ruby, integer division by zero (0 / 0) raises a ZeroDivisionError instead, although float division by zero (0.0 / 0) still returns Float::NAN (see the snippet after this list).

  • BigInt vs. BigDecimal: JavaScript has BigInt for arbitrarily large integers (123456789012345678901234567890n). Ruby's Integer handles arbitrarily large values out of the box, and the bigdecimal library provides BigDecimal for exact decimal arithmetic (often used in finance).

  • Different methods for rounding: JavaScript uses Math.floor(2.9), Math.ceil(2.1), and Math.round(2.5); Ruby calls the methods directly on the number: 2.9.floor, 2.1.ceil, 2.5.round (also shown in the snippet below).
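
To see the NaN and rounding differences in action, here is a short Ruby snippet you can paste into irb (expected output shown in the comments):

begin
  0 / 0                # Integer division by zero raises...
rescue ZeroDivisionError => e
  puts e.message       # => divided by 0
end

puts 0.0 / 0           # => NaN  (Float division by zero returns Float::NAN instead of raising)
puts (0.0 / 0).nan?    # => true

puts 2.9.floor         # => 2
puts 2.1.ceil          # => 3
puts 2.5.round         # => 3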