Yes, it's just a model. But "The Scientists" won't be running just one model once and presenting the result as fact - that's just how the media simplifies it.
They will run their model(s) lots and lots of times.
They will adjust for different parameters.
They will continually compare their modelling outcomes against what actually happened, and use that comparison to make future predictions more accurate.
They will model in uncertainty.
They will use supercomputers because of the huge amounts of data to be processed.
The models will always give a range of possibilities.
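The idea above - many runs, varied parameters, uncertainty built in, a range of outcomes rather than one number - can be sketched in a few lines of Python. This is a toy illustration, not any real epidemiological model: the SIR equations are the simplest textbook version, and every number (population, parameter ranges, run counts) is a made-up placeholder.

```python
# Toy sketch: run a simple SIR model many times with uncertain parameters
# to get a *range* of possible outcomes, not a single prediction.
# All parameter values here are illustrative placeholders.
import random

def run_sir(beta, gamma, pop=1_000_000, i0=100, days=180):
    """One run of a basic discrete-time SIR model; returns peak infections."""
    s, i, r = pop - i0, i0, 0
    peak = i
    for _ in range(days):
        new_inf = beta * s * i / pop   # new infections this day
        new_rec = gamma * i            # new recoveries this day
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
        peak = max(peak, i)
    return peak

def ensemble(n_runs=1000, seed=42):
    """Run the model many times, drawing uncertain parameters each run."""
    rng = random.Random(seed)
    peaks = []
    for _ in range(n_runs):
        beta = rng.uniform(0.15, 0.35)   # uncertain transmission rate
        gamma = rng.uniform(0.08, 0.12)  # uncertain recovery rate
        peaks.append(run_sir(beta, gamma))
    peaks.sort()
    return peaks

peaks = ensemble()
# Report a range of possibilities rather than one number.
n = len(peaks)
lo, med, hi = peaks[n // 20], peaks[n // 2], peaks[-(n // 20)]
print(f"peak infections: 5th pct {lo:,.0f}, median {med:,.0f}, 95th pct {hi:,.0f}")
```

Real models are vastly more detailed (hence the supercomputers), but the shape is the same: lots of runs over plausible parameter draws, summarised as a distribution of outcomes.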
If something concerning keeps coming up with a relatively (or unacceptably) high probability - e.g. a very large spike in infections across the country - then some sort of extra action might be needed.
I would have thought that with more damaging outcomes, they might start getting concerned at slightly lower percentages - is a 20% risk of a large spike happening too high, or is it acceptable? What about a 30% risk, or 50%, or 70%? Someone will have had to decide what level of risk is acceptable, and what level is not.
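That decision rule - "how often does the bad outcome appear across the runs, and is that above the level someone decided is acceptable?" - is simple to express. In this sketch the spike threshold, the acceptable-risk level, and the example ensemble values are all invented placeholders; choosing the real values is exactly the policy decision described above.

```python
# Toy sketch of a risk-threshold decision rule. The spike threshold and
# the "acceptable risk" level are made-up placeholders; in reality
# someone has to choose them, and that choice is a policy decision.

def spike_probability(peaks, spike_threshold):
    """Fraction of ensemble runs whose peak exceeds the spike threshold."""
    return sum(p > spike_threshold for p in peaks) / len(peaks)

def needs_action(peaks, spike_threshold, acceptable_risk):
    """True if the modelled chance of a large spike exceeds the acceptable level."""
    return spike_probability(peaks, spike_threshold) > acceptable_risk

# Illustrative ensemble of peak-infection counts from many model runs:
peaks = [5_000, 12_000, 40_000, 90_000, 150_000,
         220_000, 300_000, 450_000, 600_000, 800_000]
p = spike_probability(peaks, spike_threshold=200_000)
print(f"P(large spike) = {p:.0%}")  # 50% for this example ensemble
print("extra action needed at 20% acceptable risk:",
      needs_action(peaks, 200_000, acceptable_risk=0.20))
```

The model supplies the probability; whether 20%, 30% or 50% counts as "too high" is a judgement someone has to make outside the model.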
So the models are predicting that with Track and Trace operating at existing levels, the chance of a large spike is high enough to be concerned about.
If Track and Trace were more efficient then the risk of a large second surge in the winter would be low enough to be acceptable.