What does this describe, if not the automation of how authoritarian regimes already operate? There's no fundamental shift here, just the ability for the state to shift manpower to other tasks. There's not even a guarantee that the algorithm is more effective than manual review, only the expectation that the savings in manpower will outweigh the costs of a less effective system.
Digital automation is rarely more efficient in itself, only more cost-effective. Any serious move towards "social credit" is more likely understood by the leadership as a modernisation of the existing surveillance state than as something fundamentally new. They may present it to the public as a break with the past, but only because they want to frame it as something other than a surveillance system.
I agree that this is nothing fundamentally new. Mass surveillance by digital means is not going to produce much better data than ten officers of the secret police following someone around. But the gain in efficiency lets the state massively extend the surveillance's reach. Without automation it is simply impossible to watch everyone; with a highly automated system, the state can at least attempt to. It remains to be seen how successful that will be, but I don't think the possibility can be outright dismissed that some sort of social control will be established.
But people aren't stupid. The Chinese populace is reported to be pretty cynical about its government: people have to live with it, and know how inefficient and corrupt it can be, and how far it acts as a vehicle for factional interests rather than as a coherent force, whether or not that force is applied for the public good. If the Chinese government wants to convince its populace that the new systems produce any coherent relationship between action and outcome, it will have to ensure that such a correspondence usually holds, so that failures look like exceptions, and it's far from clear that the PRC has the means to do that. It's not as if people really trust the credit system either; they just treat it as a great, dumb beast that has to be appeased.
Ah, but appeasement is exactly the goal here. Even if people don't think too highly of the system, as long as they comply, the government has a tool for social engineering. If people suspect that doing X lowers their chances of getting into university, getting a car, a promotion, or whatever, they may refrain from doing it, no matter what they think of the system. There needs to be some legitimacy, because there is only so much arbitrariness people will accept. But as long as many people believe that noncompliance will bring future disadvantages, many will comply. Just look at how many people fall in line with the credit score system, no matter what they think of it.
Most of that data is noise. That's a problem which has bedeviled security agencies for centuries: the majority of intelligence is empty. Human agents are rarely in a position to filter the useful from the irrelevant on the spot, and it's even less probable that an algorithm would be. The archives of security agencies across Europe are full of trivial minutiae, which agents had to spend as much time filtering as they did collecting. Perhaps a further algorithm can help with the filtering, but those results will themselves need to be reviewed and filtered, and those results in turn, with no real guarantee of anything useful coming out of the process. As above, it's really just a way of automating existing processes, whose value lies in cost-effectiveness rather than efficiency. Any serious repressive measures are still going to revolve around the direct surveillance of known dissidents.
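The point about filters whose output still needs filtering can be made concrete with a small sketch. All numbers below (a million intercepts, 0.1% of them genuinely useful, three automated filter passes that each keep 90% of the real signal and discard 90% of the noise) are illustrative assumptions, not figures from any real agency:

```python
# Cascaded filtering over mostly-noise intelligence data.
# Each pass loses some genuine signal and lets some noise through.

def cascade(total_items, useful_fraction, stages, recall=0.9, noise_pass=0.1):
    """Return (useful, noise) item counts surviving the filter cascade."""
    useful = total_items * useful_fraction
    noise = total_items - useful
    for _ in range(stages):
        useful *= recall       # some real signal is lost each pass
        noise *= noise_pass    # most, but not all, noise is removed
    return useful, noise

# A million intercepts, 0.1% actually useful, three filter passes.
useful, noise = cascade(1_000_000, 0.001, stages=3)
print(f"surviving useful items: {useful:.0f}, surviving noise: {noise:.0f}")
# → surviving useful items: 729, surviving noise: 999
```

Even after three passes, the output is still majority noise, and roughly a quarter of the real signal has been thrown away along the way; someone still has to review what remains.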
The ability to collect massive amounts of data has been available for some time now, and filtering methods have not been able to keep up. But in recent years there has been much progress in pattern recognition, so filtering will become much easier in the future. And even if that only improves cost efficiency, that alone will open up more opportunities. With minor offences there has always been somewhat of an enforcement gap, where the resources required to effectively control the issue would have been larger than the benefits of controlling it. Reduce the required resources and suddenly there is a much better case for stringent enforcement. How many times have you violated a minor traffic rule and gotten away with it because there was no police officer around to punish you? Now imagine surveillance cameras with image recognition trained to recognize such violations and an automated system to issue the fine. If the false positive rate gets low enough, the state can enforce such rules much more rigidly.
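A back-of-the-envelope calculation shows why the false positive rate is the crux of that last sentence. All figures below (traffic volume, violation rate, detector accuracy) are assumed purely for illustration:

```python
# How the false positive rate governs automated enforcement at scale.

def daily_fines(events_per_day, violation_rate, tpr, fpr):
    """Split the fines an automated camera system issues per day
    into correct fines and wrongful ones."""
    violations = events_per_day * violation_rate
    innocents = events_per_day - violations
    correct = violations * tpr    # true positives: real violators fined
    wrongful = innocents * fpr    # false positives: innocents fined
    return correct, wrongful

# Assume one camera sees 50,000 passing vehicles a day, 2% of which
# actually commit a minor violation, with a 95%-accurate detector.
correct, wrongful = daily_fines(50_000, 0.02, tpr=0.95, fpr=0.05)
print(f"correct fines: {correct:.0f}, wrongful fines: {wrongful:.0f}")
# → correct fines: 950, wrongful fines: 2450
```

Because genuine violations are rare relative to total traffic, a seemingly low 5% false positive rate produces more wrongful fines than correct ones; the false positive rate has to fall well below the violation base rate before rigid automated enforcement looks legitimate rather than arbitrary.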