Assume a machine has a throughput of 70 ops/s, and we have an algorithm that performs T(n) = 2n^2 + 120 operations on an input of size n.
Determine how many hours it will take the machine to execute the algorithm for an input of size n = 8.
My attempt:

T(n) = 2n^2 + 120
T(8) = 2(8)^2 + 120 = 128 + 120 = 248 operations

My first try ended with 248 * 70 * 1000, but that is wrong in two ways: throughput is operations per second, so the running time is the operation count divided by (not multiplied by) the throughput, and seconds convert to hours by dividing by 3600, not multiplying by 1000:

248 ops / 70 ops/s ≈ 3.543 s
3.543 s / 3600 s/h ≈ 9.84 × 10^-4 hours
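
As a quick sanity check, here is a minimal Python sketch of the same arithmetic. The function name T and the constant names are mine, chosen for illustration; they are not part of the problem statement:

```python
def T(n: int) -> int:
    """Operation count of the algorithm: T(n) = 2n^2 + 120."""
    return 2 * n**2 + 120

THROUGHPUT_OPS_PER_S = 70  # machine throughput from the problem statement
SECONDS_PER_HOUR = 3600

ops = T(8)                            # 248 operations for n = 8
seconds = ops / THROUGHPUT_OPS_PER_S  # divide by throughput, don't multiply
hours = seconds / SECONDS_PER_HOUR    # divide by 3600 to go from s to h

print(f"{ops} ops -> {seconds:.3f} s -> {hours:.2e} h")
# prints: 248 ops -> 3.543 s -> 9.84e-04 h
```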