Why does the decibel level increase by the addition of 10 when the intensity increases by a factor of 10?
Your question is a little unclear, but if I interpret it to mean what I think it does, here is my answer:
Decibels measure the ratio of a given intensity I to the threshold-of-hearing intensity I0 ("I-naught"), so that this threshold takes the value 0 decibels (0 dB). To state it more succinctly, the decibel level is a function of intensity: dB = 10 * log10(I/I0).
If the intensity increases by a factor of 10 (say from I0 = 1 to I = 10), then dB = 10 * log10(10/1) = 10 * 1 = 10. So the decibel rating goes up by 10 even though the intensity increased by a factor of 10, from 1 to 10.
But that only covers the case where the intensity is 10 times greater. Let's suppose the intensity of the sound we're hearing is 100 times greater than our threshold-of-hearing intensity. Then dB = 10 * log10(100/1) = 10 * 2 = 20 decibels.
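To see the pattern, here's a small sketch of the formula above (the function name `decibels` is just my own label for it): every factor of 10 in the intensity ratio adds exactly 10 to the decibel value.

```python
import math

def decibels(ratio):
    """Decibel value for an intensity ratio I / I0 (the formula above)."""
    return 10 * math.log10(ratio)

# Each factor-of-10 jump in intensity adds 10 dB:
for ratio in (1, 10, 100, 1000):
    print(f"intensity ratio {ratio:>4} -> {decibels(ratio):.0f} dB")
```

Multiplying the ratio by 10 turns into adding 1 inside the log, which the factor of 10 out front turns into adding 10 dB.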
In the end this is just a property of how logarithmic scales work. You typically use logarithms when numbers get so large that it's easier to just consider their orders of magnitude. Imagine if you didn't use a logarithmic scale: if whispering had a value of 15, then normal talking would have a value around 150,000. A scale like that would be difficult to have an intuitive feel for.
It's all in the equation. The decibel is a made-up scale, defined as ten times the logarithm of the ratio of a sound's intensity to the threshold-of-hearing intensity.
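Written out with a physical reference, that definition looks like this. One assumption on my part: I've plugged in the standard textbook value I0 = 1e-12 W/m^2 for the threshold of hearing, which the answer above just calls I-naught.

```python
import math

# Standard textbook threshold of hearing, in W/m^2 (my assumption;
# the answer above just calls this I-naught).
I0 = 1e-12

def sound_level_db(intensity):
    """Decibel level of a sound with the given intensity in W/m^2."""
    return 10 * math.log10(intensity / I0)

print(sound_level_db(I0))        # the threshold itself sits at 0 dB
print(sound_level_db(100 * I0))  # 100x the threshold: 20 dB
```

The reference intensity only shifts where 0 dB sits; the "add 10 dB per factor of 10" behavior comes from the logarithm itself.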