If your TV vendor decides to only put 100Mb cards in their TV then unfortunately spikey boy wins and you lose unless you’re willing to downrez your AV catalog.
Venn diagram of people who understand this specific technicality and people who don’t want to deal with the shitty TV software is almost a circle though.
I’d rather get an Android box at the very least, or just an HTPC.
I’m in that Venn diagram, but I’m married with kids and the UX of anything but the TV remote and Plex software is a bit much for me to convince the family to learn. And potentially relearn when I find the next great app like Jellyfin 😅
I think there’s another circle, with at least significant overlap with those two, of family techies who just can’t convince the rest of the family to care.
My wife and kids found Jellyfin easier to use because it more closely resembles Netflix. Your mileage may vary but I get it, and it’s why I even use a media server over just plugging in a laptop with Kodi.
Sometimes the best solution is whatever you can get the users to actually use.
I set up an HDMI-over-Ethernet converter and run Ethernet between my TV and main desktop. It solves problems.
That’s one solution… it’s fine unless someone wants to use the computer while you’re watching something. For any shared-access TV/computer setup, this falls apart quickly.
I want my SO to be able to watch something on the TV while I’m playing a game though (and vice versa). Personally, all of my stuff is independent: we each have a gaming computer, and the TV runs separately from all of it. We have a Samsung smart TV and it has a Chromecast attached, so we have options there… but not everyone is set up like me.
Nobody’s using this computer except me and nobody uses it for media except during group nights so it’s no problem. Technically it has a PlayStation hooked up to it that could be used for DVDs/Blu-rays but that never happens.
They do that shit on purpose. Use a Shield or an HTPC. The only input your TV should be getting is HDMI.
Hell no! Only DVI or DisplayPort. No money to patent trolls!
From a signaling perspective, they’re very very similar. Given that all TVs have HDMI, it may be the only option.
DVI? Yes, basically HDMI is DVI guarded by patent trolls. DisplayPort? No, it is packet-based.
The only real benefit to HDMI over DVI is that it carries audio where DVI does not, which is why it’s used on so many TVs. I know DP can do audio too, so I’m not even going to touch on that. DVI, however, can do dual-link, which IMO makes it a much better video format regardless of any patent nonsense.
What the hell are you watching that has a bitrate of >100 Mbps? Because unless you have a 16K television I suspect the answer is nothing.
I have a 4K Blu-ray remux of Misery that has a 104 Mbps bitrate. But there are only a couple of movies in my collection that break 100. Most of my remuxes are around 50 to 70.
Anyhoo, it’s all moot in terms of network speed since I just use an HTPC to play all of them.
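As a rough sanity check on numbers like these, average bitrate is just file size divided by runtime. The sizes and runtimes below are made-up examples (not anyone’s actual rips), just to show where a 100 Mbps port starts to give out:

```python
# Back-of-the-envelope check: average bitrate of a remux from its file size
# and runtime, and whether it clears a 100 Mbps (Fast Ethernet) port.
# The size/runtime pairs are hypothetical examples, not real files.

def avg_bitrate_mbps(size_gib: float, runtime_min: float) -> float:
    """Average bitrate in Mbps for a file of size_gib GiB playing for runtime_min minutes."""
    bits = size_gib * 1024**3 * 8          # GiB -> bits
    seconds = runtime_min * 60
    return bits / seconds / 1_000_000      # bits per second -> Mbps

for size, minutes in [(60, 120), (80, 107), (90, 107)]:
    rate = avg_bitrate_mbps(size, minutes)
    verdict = "fits" if rate < 100 else "exceeds"
    print(f"{size} GiB over {minutes} min ≈ {rate:.0f} Mbps -> {verdict} a 100 Mbps link")
```

Keep in mind the peaks run well above the average, which is why files averaging 70–80 Mbps can still stutter on a 100 Mbps port.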
I have plenty with higher-bitrate audio that can hit 80. And with the overhead of the rest of the connections, and possibly some limits on the chipset for TCP overhead etc., it starts stuttering around that 80 Mbps limit.
80 Mbit/s audio? How?
For reference: 2 channels of 16-bit 48 kHz raw uncompressed PCM audio (i.e. “perfect except maybe the noise floor under very, very specific circumstances”) is about 1.5 Mbit/s. Even if you go 96 kHz, 6 channels (5.1 setup), 24-bit uncompressed PCM, it’s only ~14 Mbit/s plus overheads.
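For the curious, those figures come straight from channels × sample rate × bit depth; a minimal sketch:

```python
# Uncompressed PCM bitrate = channels * sample_rate * bit_depth.
def pcm_bitrate_mbps(channels: int, sample_rate_hz: int, bit_depth: int) -> float:
    return channels * sample_rate_hz * bit_depth / 1_000_000

print(pcm_bitrate_mbps(2, 48_000, 16))   # stereo 16-bit/48 kHz: ~1.54 Mbit/s
print(pcm_bitrate_mbps(6, 96_000, 24))   # 5.1 at 24-bit/96 kHz: ~13.8 Mbit/s
print(pcm_bitrate_mbps(8, 48_000, 24))   # 7.1 at 24-bit/48 kHz: ~9.2 Mbit/s
```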
The audio isn’t 80 Mbps, the entire file is. The audio is TrueHD 7.1, though. I probably don’t need it but I haven’t bothered transcoding it yet because I’m not exactly out of space or bandwidth.
Discovered this on a laptop after running the cable. Wi-Fi was getting 250 Mbps vs 10/100 speeds.
A TV, I mean, why not, but on a laptop? Is it from the nineties? O_o
My Canon ink-tank printer from the mid-COVID era is the same. I didn’t realise it was only 10/100 on the wired port until I was looking at the switch one day and wondered why I had a yellow light instead of green. I was about to run a new network cable until I checked the printer.
I guess you have to have a very particular workload, and printer, to need a gigabit line…
Right?
Casually printing highway ad posters at home, nothing special
Printers really don’t need even 100 Mbps though. They’re just not fast enough to spit out the prints you’re sending, even at those speeds. So I get it.
I get it too, but it was a bit of a shock given that the selling point for everything is bigger, better, faster, stronger; otherwise, why would people upgrade? It’s like finding something with a micro-USB port on it instead of USB-C.
Could be something wrong with a cable? A damaged cable can downgrade your connection from gigabit to 100 Mbps.
Or to 10 Mbps, half duplex. I’ve witnessed this. My former company was trying to sell a client a new server because it was too slow when I noticed it was only operating at 10/half, instead of the 1000/full that both it and the switch were capable of. Some testing later, the problem turned out to be the server-side cable termination; a quick re-termination and they were up to gigabit. Grabbed a spare run to the switch and connected another cable after verifying it was good, and the company went from 10M/half to a LAG of 2000/full in a matter of about an hour.
The speed complaints stopped.
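For anyone who wants to check what their own NIC actually negotiated, here’s a minimal Linux-only sketch reading the sysfs entries. The interface name `eth0` is an assumption; adjust it for your machine, and `ethtool eth0` reports the same speed/duplex values.

```python
# Minimal Linux-only check of what a NIC actually negotiated, via sysfs.
# "eth0" is an assumed interface name; substitute your own.
from pathlib import Path

def link_status(iface: str = "eth0") -> str:
    base = Path("/sys/class/net") / iface
    state = (base / "operstate").read_text().strip()
    if state != "up":
        return f"{iface}: link is {state}"
    speed = (base / "speed").read_text().strip()    # negotiated speed in Mbps
    duplex = (base / "duplex").read_text().strip()  # "full" or "half"
    warning = "  <-- bad cable/termination?" if int(speed) < 1000 or duplex != "full" else ""
    return f"{iface}: {speed} Mbps, {duplex} duplex{warning}"

print(link_status())
```

If a run that should be gigabit shows up as 100/full or 10/half, suspect the cable or its termination before blaming the hardware on either end.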
Is that why my shit keeps buffering any time I try to stream a movie larger than 50-60 GB, despite the fact that I have a gigabit connection and a 2.5Gb router? TIL. BRB, running some speed tests on my TV…
It’s been 9 hours, how did it go?
I don’t understand how it’s acceptable for $2,000 TVs to have only 100 Mbps ports; wouldn’t it only cost a few cents per unit to upgrade?