The width of a confidence interval is driven by alpha (indirectly, through the critical value) and by the standard deviation of your sample, more than by power, though yes, confidence intervals do narrow as n increases. That narrowing isn't the definition of precision, however. Precision reflects the spread within your sample. If your measurements are all over the place because of poor measurement quality (even if the true population standard deviation is small), the inflated sample standard deviation will widen your CI.
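A quick sketch of the usual large-sample CI formula makes both points concrete: the half-width is z · s / √n, so it shrinks as n grows and grows with the sample standard deviation s. (The values for s and n below are made up purely for illustration.)

```python
import math

def ci_half_width(s, n, z=1.96):
    # Half-width of a ~95% CI for a mean: z * s / sqrt(n)
    # z ~ 1.96 corresponds to alpha = 0.05 for a normal approximation
    return z * s / math.sqrt(n)

# Same alpha, same spread: the interval narrows as n increases
widths = [ci_half_width(s=10.0, n=n) for n in (25, 100, 400)]

# Noisier measurements (larger s) widen the interval at the same n
noisy = ci_half_width(s=20.0, n=100)
```

Each quadrupling of n halves the half-width, while doubling s doubles it, which is why a high-variance sample undoes the benefit of extra observations.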
Assuming your methods are perfect, increasing the sample size will increase accuracy, because as n grows it approaches N, the total population size; once n = N, your accuracy is 100%. Since methods are rarely perfect, you will never reach a true 100% accuracy, but it still increases with n.
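The n = N limit can be sketched directly: when you sample without replacement and the sample exhausts the population, the sample mean equals the population mean exactly (no sampling error left, only measurement error). The population below is simulated purely for illustration.

```python
import random

random.seed(0)
# A hypothetical finite population of N = 1000 measurements
population = [random.gauss(50, 10) for _ in range(1000)]
pop_mean = sum(population) / len(population)

# Sampling without replacement with n = N: the sample IS the population,
# so the sample mean matches the population mean with zero sampling error
full_sample = random.sample(population, len(population))
full_mean = sum(full_sample) / len(full_sample)
```

Any n < N subsample will generally miss the population mean by some amount; that gap is the sampling error that shrinks (on average) as n approaches N.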