Random question, but does anyone have any recommendations for Netflix shows or documentaries about dental medicine or dental care in general?
Off topic, but I like to watch Arrow, Breaking Bad, Cutthroat Kitchen, The Walking Dead, How I Met Your Mother, and Baby Daddy on Netflix.
Other than that, I don't think there really are any documentaries like that on Netflix. I would try YouTube or subscribe to Dentaltown's newsletter.
The complete series of Friends is available on Netflix.