No, I can't. It's not public material (in contrast to Geekbench scores). I gave as much backing as I could: there are classes of computational tasks where Apple's chips are so efficient (uarch + ISA) that they outgun Haswells per clock. Combine that with the chips' TDP and they become prime candidates for distributed workloads of those classes. A pair of high-clocked* A8s at $150 has no competition in that space, especially for budget installations. It's as simple as that.
Erm, you need power efficiency for most computationally-heavy tasks. One of the reasons GPUs pummel CPUs at GPGPU - you can always throw more CPUs to match the power of a GPU, but the GPU is still more efficient.
* Still sub-2GHz, I guess.
Well, it's hard to discuss performance and usage needs without details, so I'll just have to take your word for it. It sounds to me like a highly specific edge case, though it's hard to say without knowing more about the requirements of the simulation.
I don't believe anyone said it was the perfect or best box, but rather, it has the brightest future. I would agree with that, given Apple's chops in software and their developer base.
And you did say the Fire TV was a better solution than the Apple TV, which I don't agree with because of my own preferences. I don't care for XBMC or sideloading Android apps the way you do. Both decisions are perfectly acceptable.
But it's kind of an empty statement to say it has the brightest future. You might as well say that about any product Apple releases, given their position in the market, their money and resources, their clout, and their history. Everyone was saying Apple was going to show how to do watches right and that they had the most potential, and it's been lukewarm since it was announced and released. Sure, Apple's position gives them a lot of potential, and I'd definitely acknowledge that, which is why I've been highly interested in an Apple TV since I got my first Apple TV2. But again, potential is meaningless if it's never realized, just like the PlayStation TV. By all accounts the PlayStation TV should have been great with all its potential, and Sony dropped the ball on it hard. Not to say Apple is going to blunder it like Sony did, but potential only goes so far.
I also said I think the Fire TV is in a better position right now. Not forever, right now. Like everyone said, the Apple TV4 is just a few days old; it hasn't even stretched its legs yet, but that doesn't mean Roku and Fire TV aren't more mature and better boxes at this point in time. In theory the Fire TV should have a ton of potential too, if you want to use potential as a measuring stick, but the promised universal search in Netflix still hasn't panned out after a year and is a sticking point for you. The CEO of Netflix even said it was coming shortly a year ago. It would have sucked if voice search in Netflix was a huge deal for you and you bought the box because Reed Hastings said it was coming within a few months, which turned into over a year and nothing. That's why banking on potential isn't always great. Why would you have disbelieved Amazon and Reed Hastings when they said it was coming?
Clearly Plex doesn't make sense in your particular case, but so others don't get scared off, it needs to be pointed out that there are numerous situations where it could be useful for them.
Plex's client / server model was certainly a product of its time. There were a number of streaming boxes hitting the market that simply did not have the processing power to do software decoding of HD content, so the idea of breaking XBMC in two so you could have a transcoding server made perfect sense. And even today there can still be issues on devices like the Fire TV: high-bitrate 1080i deinterlacing, for example, can still struggle in Kodi on the Fire TV, so people using networked tuners or DVR content may still need a transcoding backend.
For people on the cutting edge, I'd say we are getting towards the end of its necessity on streaming boxes. In a few generations I suspect we'll be at a point where they can handle any HD playback. So for that reason I think it makes sense for Plex to start doing a hybrid approach; clients that have the processing muscle should start doing software playback. Of course the reality is there will still be plenty of people that are slow to adopt. Not everyone is like us and buys new streamers every year.
Even then, however, it doesn't mean server transcoding has no use. What's gonna happen with UHD? I suspect we'll have a nice period where a PC backend still makes sense for a large contingent. The bigger point, though: what about mobile? Sure, phones keep getting faster and faster, but until there's a major breakthrough in battery life you'd still run into battery drain from software decoding.
More importantly though ... what of bandwidth? Plenty of people use Plex to stream over a WAN (or even mobile). Running full bitrate, even if you can decode it on the other end, can be problematic. Whether it's a phone, tablet, browser or streaming box, if you want to access it outside of your home there's plenty of reason to transcode it. Some people have bandwidth caps ... some people have slow upload speeds ... plenty of locations you may access your content on the go have crap download speeds. The list goes on and on.
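To put that in rough numbers, here's a quick sanity check of whether a stream fits a WAN link. This is just a sketch; the bitrates, upload speed, and headroom factor are all made-up example figures, not measurements.

```python
# Rough sketch: does a stream's bitrate fit an upstream link?
# All numbers below are hypothetical examples.

def fits_link(stream_mbps: float, upload_mbps: float, headroom: float = 0.8) -> bool:
    """True if the stream fits within a safety margin of the uplink."""
    return stream_mbps <= upload_mbps * headroom

# A full-bitrate Blu-ray remux (~30 Mbps) over a 10 Mbps cable upload:
print(fits_link(30, 10))  # False -- full bitrate won't fit; transcode it
# The same movie transcoded down to ~4 Mbps:
print(fits_link(4, 10))   # True
```

The headroom factor is an assumption on my part, just to leave room for protocol overhead and other traffic on the link.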
I've been upfront about this before; one of my biggest gripes about Plex is how they took XBMC and then charged money for some functionality. That has bugged me since the beginning, and add the limitations and the transcoding on top and it was just never something I took to, since I started with XBMC and Plex felt at every step like XBMC-lite. Even when a unified database started becoming more appealing to me and I looked at Plex with fresh eyes again, I found I could do the same thing in XBMC and it only took 5 to 10 minutes to set up. So I've never seen the benefit for me. I understand its strengths, and a lot of that is ease of setup, but I've always felt you could get more out of XBMC than Plex in the end.
That said, one of my biggest gripes is going away now that they're ditching Kodi as the core. I'm willing to take another look when they switch over and see how they evolve with their new core, and whether there's something to gain. I'm still in search of the perfect box because nothing has met my requirements, and possibly nothing will, but XBMC on the Fire TV was a huge step forward that no other box offered me in that quest, and they were stupidly cheap to boot, with sales as low as $60, offering functionality my HTPC couldn't.
Honestly I'm just like a dog chasing a car with this whole DD+ quest. DD @ 640k sounds more than fine to my ears, I've just heard tell that DD+ is roughly twice as efficient and bests DTS-HD @ 1.5Mbps (i.e. not MA).
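For a sense of scale, the file-size difference those bitrates imply is easy to ballpark. A sketch, with two assumptions on my part: "1.5 Mbps" DTS-HD is taken as 1536 kbps, and the DD+ figure assumes a hypothetical encode at half of DD's 640 kbps rate.

```python
# Back-of-the-envelope audio track sizes at constant bitrate.
def track_size_mb(kbps: float, seconds: float) -> float:
    """Track size in decimal megabytes: kilobits/s -> kilobytes/s -> MB."""
    return kbps * seconds / 8 / 1000

two_hours = 2 * 60 * 60  # a typical movie runtime, in seconds
print(track_size_mb(640, two_hours))   # DD @ 640 kbps         -> 576.0 MB
print(track_size_mb(320, two_hours))   # DD+ at half the rate  -> 288.0 MB
print(track_size_mb(1536, two_hours))  # DTS-HD @ ~1.5 Mbps    -> 1382.4 MB
```

So if the "twice as efficient" claim holds, that's roughly 300 MB saved per two-hour movie versus DD, and closer to a gigabyte versus a DTS-HD track.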
There isn't much DD+ content out there, but I rip my Blu-rays from whatever lossless format they're in into FLAC, and from there use Handbrake and/or ffmpeg to transcode the video/audio down to a size more suitable for archival and generally more compatible with things like my iOS devices and now this Apple TV.
I'm slightly confused. Do you have media that natively has DD+, or are you taking lossless audio and re-encoding it to DD+? If it's the latter, what kind of file size gains are you getting? If it's the former, what has DD+? The only thing I've ever seen trigger DD+ is Netflix for me.