Without numbers or access to the program, we can't tell you too much. Do you know how long it takes to do 1kb? How about 2kb? Does it grow linearly with the size of the input, or is the curve different? What about memory footprint?
(Google for "big-oh" if this is unfamiliar territory.) The actual math is uninteresting, but once you understand the concepts and have a few basic measurements to plug in, it's a quick multiplication or two to turn them into a real-world estimate in minutes or hours.
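To make that concrete, here's a sketch of the "quick multiplication or two". The numbers are made up for illustration: suppose you measured that 1 KB takes 2 seconds, and you want an estimate for a 500 KB input under two different growth curves.

```python
# Hypothetical numbers: suppose 1 KB of input takes 2 seconds to process.
t_1kb = 2.0
size = 500  # KB, the input size we actually care about

linear_estimate = t_1kb * size          # O(n): time scales with size
quadratic_estimate = t_1kb * size ** 2  # O(n^2): time scales with size squared

print(f"linear:    {linear_estimate:.0f} s (~{linear_estimate / 60:.0f} min)")
print(f"quadratic: {quadratic_estimate:.0f} s (~{quadratic_estimate / 3600:.0f} h)")
```

Same program size, wildly different answers (minutes vs. days), which is why knowing the curve matters more than knowing any single timing.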
For example, I'm working on a program which needs to compare each new input to all collected previous inputs. Each run does one more comparison than the last, so the work per run grows in direct proportion to the number of inputs collected so far. That's linear growth.
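A toy sketch of that pattern (the function name and counting are mine, just to show the shape of the growth): count how many comparisons each new input costs as the history grows.

```python
def comparisons_per_input(num_inputs):
    """For each new input, count comparisons against all previous inputs."""
    history = []
    costs = []
    for item in range(num_inputs):
        # one comparison against each previously collected input
        costs.append(len(history))
        history.append(item)
    return costs

print(comparisons_per_input(5))  # [0, 1, 2, 3, 4]
```

Each run costs one comparison more than the last: a straight line when plotted, hence "linear".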
Now, I have a different program which needs to reorder all the inputs it remembers. So when a new input comes in, all the inputs need to be recompared against each other, in order to put them in the new correct order. That one slows down much faster, because the number of pairwise comparisons grows with the square of the input count: double the number of inputs and you roughly quadruple the comparisons. That's quadratic growth.
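You can see the quadratic shape by just counting the pairs (again a toy sketch, not the real program): n inputs have n*(n-1)/2 distinct pairs to compare.

```python
def pairwise_comparisons(n):
    """Count comparisons needed to compare every pair of n inputs."""
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            count += 1  # one comparison per distinct pair
    return count

for n in (10, 20, 40):
    print(n, pairwise_comparisons(n))
# 10 -> 45, 20 -> 190, 40 -> 780
```

Each doubling of n roughly quadruples the count (45, 190, 780), which is exactly the O(n^2) curve.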
The O(n) mumbo-jumbo is just a convenient way to talk about these things. It takes about half an hour to get it at first, but it's time well spent.