Gaming -> Intel i5-6500T (HD 530 GPU) vs i5-6260U (Iris 540 GPU)

Can you tell me why the i5-6500T has a lower price on ark.intel.com even though it's a quad core with a higher CPU frequency? Which of these (Iris 540 vs HD 530) would perform better in games?

I have two choices: Dell 7040 (i5 6500T) and Intel NUC6 (i5 6260U).

Get better choices.

U is fucking shit

The Iris 540 seems to perform pretty decently: youtube.com/watch?v=Dw-FT1LQytg

Unfortunately I am stuck with these two and I have to pick one :)

why

Why are you limited to either a $600 Optiplex or a $300 NUC?

I get one of these for free :) and I must use it, not sell it and buy something else. Back on topic: Iris 540 or HD 530?

>I have two choices: Dell 7040 (i5 6500T) and Intel NUC6 (i5 6260U).
>GAYMING

Get the Dell and get a proper GPU for it, like a GTX 750 Ti

try to get the tower version of the Dell

The Iris 540 will be much better in games, but it's like comparing "this was out of date 10 years ago" vs "this was out of date 8 years ago". Those years aren't pulled out of my ass; the Intel GPUs are about equivalent to the top-end cards of those years (the 8800 GTX and GTX 285, respectively).

so yeah, Dell + a cheap GPU; something around the GTX 750 Ti / R7 260X or GTX 1050 / RX 460 range.

If it's an SFF Dell and you're tech-savvy enough to remove it from its case, add a video card, and throw it in a cardboard box, I'd go that route. It's a faster CPU. If you can't, and this is basically just a console... I'd first look into reviews of the NUC and see if there are any signs of throttling under heavy use. If it's fine, its ~30% GPU speed advantage over the HD 530 will probably mean more in games than the slower CPU.
If you can get the mid-tower Dell, definitely go for that.
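
If you want a rough sense of why the Iris 540 pulls ahead, here's some back-of-the-envelope math. The EU counts and max clocks below are what I remember from ARK (so double-check them), and peak GFLOPS ignores the bandwidth and power limits that shrink the real-world gap:

```python
# Rough peak-FP32 comparison of the two iGPUs (both Gen9 graphics).
# Each Gen9 EU has two SIMD4 FMA units, i.e. 16 FP32 FLOPs per clock.
FLOPS_PER_EU_PER_CLOCK = 16

gpus = {
    # name: (execution units, max dynamic clock in GHz) -- from memory, verify on ARK
    "HD 530 (i5-6500T)":   (24, 1.10),
    "Iris 540 (i5-6260U)": (48, 0.95),
}

for name, (eus, ghz) in gpus.items():
    print(f"{name}: {eus * FLOPS_PER_EU_PER_CLOCK * ghz:.0f} GFLOPS peak FP32")
```

On paper that's roughly 420 vs 730 GFLOPS, plus the Iris has 64MB of eDRAM to ease the memory bandwidth bottleneck; in actual games the gap lands closer to the ~30% mentioned above.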

why doesn't anyone game with the Xeons? they're cheap as fuck and the benchmarks are insane.

What Xeons? There's like 50 million of them and people do game on them.

my go-to recommendation for people who don't OC is the E3-1230, so you're preaching to the choir here

You mean the old S771 Xeons? Those are ancient now.
The newer ones only work in expensive motherboards.

no, like the E5-2665, etc.

will they not work in any LGA 2011-v3 board?

Only in boards with microcode for it

Somewhat related question

Is the i5-7500 really a plain, straight upgrade over the 6500? I'm currently looking for a new processor, and they're at almost the same price tag, yet the 7500 seems much better in every category.

Is there some hidden ruse?

The U has just as many threads as the T and draws a third of the power and heat this gen

The old ones have shit IPC and are worthless for games, and the new ones offer no advantages over the Core i7 series

Kaby Lake is virtually a Skylake stepping; the only difference is a VP9/H.265 decoder that can handle 4K. Skylake was intended to have this, but it was disabled due to a bug.
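
If you'd rather check what your own iGPU can decode in fixed-function hardware than take the marketing's word for it, one way (on Linux, assuming the Intel VA-API driver and the `vainfo` tool are installed) is to grep its output for the HEVC/VP9 profiles; a minimal sketch:

```python
# Minimal sketch: list the HEVC/VP9 decode profiles the iGPU driver
# exposes via VA-API. Requires Linux with `vainfo` installed.
import subprocess

out = subprocess.run(["vainfo"], capture_output=True, text=True).stdout
for line in out.splitlines():
    if "HEVC" in line or "VP9" in line:
        print(line.strip())
```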

That literally doesn't matter, though, since GPUs already support that shit.

oh and it's 200MHz faster

oh and the 7500 won't work with Windows 7 or 8.1, because Intel and Microsoft explicitly won't support them.

oh and they won't overclock; the 6500 will overclock on certain boards with a specific BIOS

>simultaneous multithreading being on par with full cores

>triple the heat for double the cores *and* no HT

Kaby Lake works fine with Windows 7 and 8; it's later on that support officially moves forward, leaving 7 behind entirely

Skylake's 25W U processors got upgraded to 15W U processors. The 5W line is becoming comparable with the lower-end 15W line. Not everybody has a GPU from the current gen, and that decoding matters for not pegging the CPU at 100% just trying to play 4K YouTube. And of course there's the actual DRM on Kaby Lake that makes GPU decoding impossible for certain commercially used codecs; you can't just bypass that with a Pascal GPU.
Kaby Lake was a big step for mobile CPUs

>the 5W line is becoming comparable with the lower-end 15W line
No, they're just calling the Y series "Core i5s and i7s" to fool people into buying Chromebook CPUs
The 5W chips are blue here; the others are 15W

I'm seeing nothing like that, and I'm directly comparing the full mobile Skylake and Kaby Lake lines right now. All I'm seeing is, as I said above, that they're about 200-300MHz faster.

And you're right, that decode is good for mobile users. Too bad it was intended to be on Skylake, and was disabled because it was broken. Totally worth a whole media hullabaloo and renaming though.

not that it matters to the point you're making, but Chromebooks come with U-series processors, not Y

lol

"Chromebook" is generally just an overall term for any cheap smartphone that got stretched out to 15"

Look harder: significantly higher clocks, significantly lower TDP

not seeing it mate

>3.6GHz dual core i7 that can be powered off a single AAA battery
This is no laughing matter

15W'ers, by comparison

btw, the second TDP is what they'll run at when cooled properly or when power isn't an issue. The third is the power they'll draw in a low-power state.
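
For anyone unfamiliar with how those two limits interact: the chip can burst above its sustained limit for a while, then gets pulled back once a rolling average of its power draw catches up. A toy sketch of that behavior (the PL1/PL2/tau values here are made up for illustration, not from any datasheet):

```python
# Toy model of Intel's two-level power limiting: burst up to PL2 until an
# exponentially weighted average of draw reaches PL1 (the sustained "TDP").
PL1, PL2, TAU, DT = 15.0, 25.0, 28.0, 1.0  # watts, watts, seconds, seconds

avg = 0.0
for t in range(120):
    draw = PL2 if avg < PL1 else PL1   # throttle once the average hits PL1
    avg += (DT / TAU) * (draw - avg)   # rolling average over roughly TAU seconds
    if t % 30 == 0:
        print(f"t={t:3d}s  draw={draw:4.1f} W  avg={avg:5.2f} W")
```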

Except they fucking won't. That bugger won't hit 3.6GHz unless it's a single-threaded application that isn't loading the GPU at all, while it's plugged into a wall, and before it starts to bake itself.

Still looking at the wrong ones
Excuse my iposting

The peak I've seen on my 7500U is 16 watts (according to CoreTemp) while running AIDA64 at 3.5GHz
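
For what it's worth, on Linux you can measure package power yourself from the RAPL energy counter instead of trusting CoreTemp; a minimal sketch (the sysfs path is the standard one for Intel chips, usually needs root):

```python
# Minimal sketch: estimate CPU package power from the RAPL energy counter.
# energy_uj is cumulative microjoules; power = delta energy / delta time.
import time

RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"

def read_uj() -> int:
    with open(RAPL) as f:
        return int(f.read())

e0, t0 = read_uj(), time.time()
time.sleep(1.0)
e1, t1 = read_uj(), time.time()
print(f"package power: {(e1 - e0) / 1e6 / (t1 - t0):.1f} W")
```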

>apple user focused on the fact it's labeled an i7
really fires up the neurons


did you know the lead designer behind the Apple CPU you're currently using got hired by AMD to be the lead designer behind Zen?

>before it starts to bake itself
A 5-watt TDP is not hard to cool, even for a fucking tablet

yes, while plugged in, cold, and not using the GPU

Yeah, I'm pissed they're selling their 1.8GHz 5W tablet CPUs as "Core i7s". Even though the performance is catching up to (not caught up with) the U series, it's still a ploy to grab money from anybody who thinks their laptop can play games because a game recommends a Core CPU.
Are you not?

>on battery
>after playing RuneScape for an hour (which got it to 50°C already)
>with the iGPU doing the rendering (even more heat in the CPU package than using a discrete GPU)

Why sure, keeping it from overheating is pretty easy. Keeping it cold on the other hand....

remember, heat transfer is roughly linear in the difference between the chip's temperature and ambient (simplified model)
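
That simplified model is Newton's law of cooling: the heat a cooler sheds is proportional to how far the chip sits above ambient. A quick sketch with a made-up thermal constant shows why "cold" is so much harder than "not overheating":

```python
# Newton's law of cooling, simplified: dissipation = k * (T_chip - T_ambient).
# K here is a made-up constant for illustration, not a measured value.
AMBIENT = 25.0   # deg C
K = 0.5          # watts shed per degree above ambient

for chip_temp in (30.0, 50.0, 80.0):
    watts = K * (chip_temp - AMBIENT)
    print(f"{chip_temp:.0f} C -> sheds {watts:4.1f} W")
```

A chip held at 30°C can dump barely a tenth of the heat the same cooler moves at 80°C, which is exactly the point above.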

That being said, I'm fucking impressed with the battery life on this thing.
I don't know if it's because of Kaby Lake, but this can run benchmarks for hours, it can shitpost for 10 hours straight, I can wake up and never worry about bringing a charger to work, and it's thinner than any laptop I've ever even considered owning. Battery life, and the weight cost of having good battery life, were huge laptop problems imo, and this solved that shit with no compromises I can see