A Third Reich drama series is a no-brainer. You've got your violence, political intrigue, womanizing and romancing, scandals, great uniforms, etc. etc., yet they haven't pulled the trigger on it.
There's like 10 million documentaries on Nazi Germany/WWII and people are still :sayword: to their TV sets when a new one drops. A Game of Thrones/Mad Men type drama would be huge.
Throw in the real-life backstabbing between the Allies and you've got a surefire hit.
Better than another Vampire series.
They've squeezed all the fukking life out of the Zombie/Vampire genre. It's OVER, DEAD (pun intended). Please, can we do something else now? 

Probably would be great TV, but it would be hard for most people to sympathize in any way, and it would bring a ton of controversy and outrage. I can already envision the types of characters and dilemmas people could have in the series, plus the story arcs and character development. Nazis are painted as cartoonish embodiments of evil, when the real tragedy is that people from anywhere in the world can be led toward evil through the need to cope with something and/or the need to have purpose in their lives. Plus, of course, the classic line, "All that is necessary for the triumph of evil is that good men do nothing," obviously applies. Hard to fathom any remote possibility of anyone seriously pitching the concept to TV networks anywhere.