>>54767252

DELETE THIS


>PUTTING PAJEET IN CHARGE OF YOUR GPUS

Wasn't HardOCP biased against AMD anyway?

>TL;DR
The website is butthurt they don't get exclusive intel/hardware from AMD anymore, so they decided to paste some nonsense together to make it seem like AMD is keeping shit quiet because their new cards are shit.

Did I miss any details after reading everything?

Kyle has a hard-on for hating AMD. Read his forum comments and you'll see. He's the editor-in-chief.

I wouldn't trust them to be fair in their articles.

All I'm seeing there is 'We got no free trip AMD is dum'

POO IN THE LOO
O
O

I
N

T
H
E

L
O
O

HARDONCP

>Full disclosure - HardOCP was not invited to this weekend’s launch in Macau as AMD PR has made a decision to no longer brief this site with the rest of the industry

Karma eh. These arseholes were always bashing ATI back in the day. Surprised that shitty site still exists.

> Let’s start with the tension. Koduri was able to wrestle control of the graphics division away during AMD’s last leadership transition after threatening to leave the ship and take a role at Intel, something he's not shy about telling his AMD colleagues. Lisa Su caved and Koduri got the job.

>Letting this Pajeet hold entire AMD company to ransom

TOPPEST LELKEK

Dlet this

>While Koduri is known to have a strong desire to do this by forging a new relationship with Apple on custom parts (no surprise there) for Macbooks, the real focus is on trying to become the GPU technology supplier of choice to none other than Intel

>The Polaris 10/11 launch, and all of its problems, are set to become a future problem of Intel’s in what RTG believes will be a lucrative agreement that will allow Koduri and his men to slash the lines from Lisa Su and the rest of AMD.

>With Intel in the midst of a shake-up under their new chief product guy, [Dr. Venkata “Murthy” Renduchintala]


POO....
O......
O.IN...
..N....
....LOO
....O..
....O..

>he thinks I'll click his stupid clickbait

jesus christ anon, you'll have to do better than this.

basically
>even pooinloo knows there's no hope for amd
>trying to rescue ati back from amd
>motivated by money, doomed to fail

GET.
POTTY.
TRAINED.

Just some rumors from anti-AMD editor. Can it be trusted? Probably not. He's real butthurt about all this.

I will sir, I'll be laughing on the 31st at you being wrong about AMD sir :)

Koduri is the Keller of graphics processors though.

>Koduri is the Keller of graphics processors though.

who invented this meme

> Ya think? There will be a few folks that call bullshit on this, but let's see how close I am to the mark when the smoke clears. That said, we do not publish articles such as this based on rumors, and never have.

He's risking the entire reputation of his decade-old site and his main source of income on this.

>calling out amd
>anti-AMD editor

kek you amd shills are so batshit insane

watch out the boogeyman is going to burn your amd cards

lmao

Ha, the readers will forget about it in a month, while the clickbait will bring in plenty of revenue.

videocardz.com/60373/amd-polaris-tech-day-nda-ends-on-june-29th

Jokes on you, NDA ends on June 29th

AYYMD IS LATE AND IRRELEVANT

He'll just say "well this is what my sources say". His reputation is already being trashed due to his anti-AMD stance.

You don't need me to tell you that. You just have to read his forum comment history. If they're deleted, try google cache. There's plenty of evidence there.

POO IN THE AMD LOO
O
O

I
N

T
H
E

A
M
D

L
O
O

All of his posts against AMD are him calling out AMD for being shit, which is a perfectly fine complaint.

>One just has to look at the uncomfortable interaction between the two on the stage at the recent Game Developers Conference "Capsaicin" event and the body language says it all.

Finally, someone who noticed this.

I swear, /g/ was blind to this. Not one meaty comment on it. You could tell Koduri was wearing out his welcome.

K Y L E
Y
L
E

That whole conference was horribly awkward and cringy. How could you tell?

AMD WILL BECOME SUPERPOWER BY 2020

If all his AMD posts are like that, then there won't be any point in giving him any AMD products to review. You already know the answer: it's shit.

If this is what excites you, then there are issues.

which just showcases how scared AMD is of Nvidia. Nvidia doesn't cower like that.


how weak.


meanwhile nvidia is so full of confidence they threw 1080s at everyone for free to keep.

this is water cooler gossip.

You want him to lie?

>how weak
Children's board is that way.

No. Lying is an issue as well. The problem is, his truths are boring. Everyone believes in some sort of truth; if that truth doesn't change, then what is the point in wanting to hear more of his other truths?

Pathetic.

>someone calls amd for what they are
>go back too bawww


You're virtually the same.

What the fuck are you saying dude. He's a hardware journalist, not a comedian.

>I can already hear the fanboys ripping into my thoughts here yelling about the HardOCP's NVIDIA and Intel bias.

> Full disclosure - HardOCP was not invited to this weekend’s launch in Macau as AMD PR has made a decision to no longer brief this site with the rest of the industry.

kek the salt. It's kitguru all over again.

Journalists become comedians after they lose their credentials.

He was cheap.

This settles it then: he's shilling Nvidia and bashing AMD's new gen without any basis.

>this is water cooler gossip.
fuck off

this is AMD, not water cooling discussion

>How exactly did AMD try to buy HardOCP off?

>AMD's Chris Hook called me and offered an all expense paid long weekend trip out to San Francisco for my wife and I, and then the first exclusive interview with Raja Koduri and he would hold all press so HardOCP was first. I declined the offer.

hardforum.com/threads/from-ati-to-amd-back-to-ati-a-journey-in-futility-h.1900681/page-2

>missing the point entirely

loo

>his truths are boring

if the truth isn't painful, then it's not the truth

good news is actually lying

>hotter and slower than its competition's
85 watts
85 WATTS
That's lower than an OC'd CPU and half of what Nvidia draws, and it means there's massive room for clocking.

Nvidia is over.

RED ROSTER ON SUICIDE WATCH

No Pajeet you can not poo in it

Not all truths are painful. Truths that don't change for years become stale and boring. Truths that don't change across different scenarios/events/times/places stay stale.

What verdict do you think he would have given in a review of AMD Polaris? "It's shit." Vega? "It's shit." An unknown AMD product that hasn't even launched? "It's shit." (see the Fury series)

This guy deletes comments and bans people calling him out on his shit.

They're stuffing like a billion cores at 14nm. It doesn't matter if it's 85w or 5w. The heat will be so concentrated over such a small surface area that it won't ever dissipate properly.

It's why even the latest 14nm shit from Intel runs hot as hell even with direct die watercooling despite being their most efficient processor ever.
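The heat-concentration argument boils down to power density: watts divided by die area. A quick back-of-envelope sketch (the TDPs and die areas below are made-up round numbers for illustration, not real specs of any card):

```python
# Power density: why a smaller die can be harder to cool even at a lower TDP.
# All numbers are illustrative assumptions, not measured specs of any GPU.

def power_density(tdp_watts, die_area_mm2):
    """Watts dissipated per square millimetre of die."""
    return tdp_watts / die_area_mm2

# Hypothetical 28nm-class part: a big die spreads the heat out.
big_die = power_density(150, 438)    # ~0.34 W/mm^2

# Hypothetical 14nm-class part: lower TDP, but a much smaller die.
small_die = power_density(110, 232)  # ~0.47 W/mm^2

print(f"28nm-class: {big_die:.2f} W/mm^2")
print(f"14nm-class: {small_die:.2f} W/mm^2")
```

So even with a noticeably lower TDP, the smaller hypothetical die ends up with a higher W/mm², which is the whole "the heat won't dissipate properly" worry in one division.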

Emotions have no place in the eyes of objective data.

>AMD ruins everything

Pray tell, how is this news?

"watercooler talk" is a figure of speech, you moron.

Glad we agree.

Then shut the fuck up about him being boring.

You seem a bit emotional. Did I rustle you that much?

Being emotional has no place in objective data.

Kyle, go back to your site. /g/ is a lost cause, they can't see the truth that you see.

POO
O
O

what is overclocked poo. diarrhea?

Telling fags off =/= u mad

thanks bb

>Let’s start with where we are currently. Full disclosure - HardOCP was not invited to this weekend’s launch in Macau as AMD PR has made a decision to no longer brief this site with the rest of the industry.

Holy fuck, you can feel the butthurt when the guy wrote that.

I don't have any preference for either AMD or Nvidia, just the best bang for the buck, but shit... this guy is MAD as hell at AMD. It really makes me wonder if it's pure hate or if Nvidia paid him to rage at them.

If I'm right, couldn't there be a lot of problems for him and the site? He's saying a lot of bad stuff about a multi-million-dollar company without any proof... I do hope AMD has a great 400 series launch just to see the backlash against the guy and the site.

Especially the /g/ reaction.

he's one of the few reviewers not afraid to speak his mind, and he has shat on both nvidia and amd in the past.

Kyle is way off the mark at times but I think he's probably being quite candid here.

GCN's failure is pretty much a direct result of AMD choosing to fire all the white engineers in favor of hiring over 1,000 employees on H1B visas to replace them. TeraScale was a much better architecture overall and would be destroying NVIDIA right now if they had stuck with it.

>VLIW
>post 2010

Come on.

>Where the plot thickens is when you look at the Koduri’s unwavering ambition. Koduri’s ultimate goal is to separate the Radeon Technologies Group from its corporate parent at all costs with the delusion that RTG will be more competitive with NVIDIA and become a possible acquisition target for Koduri and his band of mutineers to cash in when it's sold. While Koduri is known to have a strong desire to do this by forging a new relationship with Apple on custom parts (no surprise there) for Macbooks, the real focus is on trying to become the GPU technology supplier of choice to none other than Intel. While this was speculated some time ago I can tell you with certainty that a deal is in the works with Intel and Koduri and his team of marauders working overtime to get the deal pulled into port ASAP. The Polaris 10/11 launch, and all of its problems, are set to become a future problem of Intel’s in what RTG believes will be a lucrative agreement that will allow Koduri and his men to slash the lines from Lisa Su and the rest of AMD.

If what he's saying is true I think that bodes well for RTG in the future. But AMD might be done and cooked in the CPU business if this happens, considering AMD has relied on the revenue from GPUs increasingly since they lost their share in the CPU market.

That's real fuckin neato. Let's see some sauce.

I've done my fair share of hardware reviewing in the past.

One thing is to speak your mind, which I was completely free to do (because all the hardware I reviewed was borrowed from a store, not from the companies themselves).

And a completely different thing is to just shit on them because they did something you didn't like. Imagine if every site did that... /g/ would be the best source for news and reviews!
>It isn't

Kyle either REALLY has very good insider knowledge indeed, or he's shooting arrows into the sky and hoping for the best, without thinking about the legal implications, the image of the site, and how the article itself will impact other sites, the AMD stock and so on... that's why I'm really interested in this.

he's not shitting on them because they did something he didn't like... he's shitting on them because AMD, a company that used to be hugely successful and renowned, has been mismanaged into the ground and looted by idiots. now they're trying to market subpar products to maintain relevance instead of putting out something actually decent.

there is a silver lining, however: if intel purchases RTG, we're going to have GPUs produced at intel's fabs, which are an entire generation ahead of every other fab in the industry.

>he's not shitting on them because they did something he didn't like
Why do you think this is the case? All evidence points to this, yet you're very sure of this not being the case. Why is this? Kyle has a history of temper tantrums.

>The heat will be so concentrated over such a small surface area that it won't ever dissipate properly.

This is offset by the fact that the 14nm transistors produce way less heat.

It has been like that with every node jump in computing history.

>It's why even the latest 14nm shit from Intel runs hot as hell

This is the FUD that Intel wants you to believe, when the truth is that they just put thermal paste between the silicon and heat spreader, instead of soldering it, in order to GIVE the processor thermal problems to discourage overclocking.

Because they haven't had a relationship with AMD since Roy Taylor and the Nano fiasco.

wtf? Essentially what you're saying is that every single node shrink causes processors to become hotter. Besides, the heat doesn't scale linearly, because this is FinFET technology: the gates have volume, so they can effectively accept higher currents, which is why they could push clocks so high so quickly on the node shrink for Pascal.

As far as the article goes, the whole "hotter and less power efficient" thing is all just a result of the rumor mill again. Not anything to pay attention to.
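For what it's worth, the "gates have volume, so clocks go up" argument can be framed with the classic dynamic-power relation P ≈ α·C·V²·f: voltage enters squared, so a FinFET node that runs at a lower voltage can clock noticeably higher for the same power. A toy sketch (the capacitance, voltages and clocks below are made-up illustrative values, not real Polaris or Pascal figures):

```python
# Classic CMOS switching-power relation: P ~ alpha * C * V^2 * f.
# All values here are illustrative assumptions, not real GPU figures.

def dynamic_power(cap_farads, volts, freq_hz, activity=1.0):
    """Dynamic (switching) power of CMOS logic."""
    return activity * cap_farads * volts ** 2 * freq_hz

planar = dynamic_power(1e-9, 1.2, 1.0e9)  # "28nm-ish": 1.2 V at 1.0 GHz
finfet = dynamic_power(1e-9, 1.0, 1.4e9)  # "14nm-ish": 1.0 V at 1.4 GHz

# 40% higher clock at roughly the same power, because V enters squared.
print(f"planar: {planar:.2f} W, finfet: {finfet:.2f} W")
```

Drop the voltage from 1.2 V to 1.0 V and you buy back enough power headroom to raise the clock 40% for free, which is roughly the effect being described.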

>TeraScale was a much better architecture overall and would be destroying NVIDIA right now if they had stuck with it.

Terascale was only useful for password cracking.

If they had kept to it, the Fury X would've had 16000 cores and would perform the same as it does today.

>Let's get this out of the way: I'm EXTREMELY fanny-flustered and hate AMD and they hate me and have blacklisted me but this doesn't affect my article in any way I promise
lol nice try

terascale was performing on the level of nvidia's flagships in half the die area. if they had continued that level of gaymen performance they wouldn't be in the world of shit they are today.

>if intel purchases RTG, we're going to have GPUs produced at intel's fabs, who is an entire generation ahead of every other fab in the industry.

If Intel purchases RTG, they'll stop development of all their products so their own Intel GMA tech will not have any market opposition.

The trick is to water cool your AMD in a nice clean toilet. This is why AMD users shit in the streets, they can not have the card become dirty with feces.

People should read Kyle's recent GPU reviews more closely. He has a nasty habit of suggesting a slower GPU is flat-out better than a faster one when (in these scenarios) the faster card happens to be an AMD one.

The last time he pitted a 390 against a 970, the 390 was faster nearly across the board, but he basically said "get the 970 because".

You can't just erase someone's anger/bias simply because it was on pause for a year.

And I very much doubt his anger with AMD disappeared without a trace. But you acknowledged that he had past issues with AMD.

Why wouldn't this negative AMD article be born from his anger with AMD?

His reporting could very well be true; there is literally no way to verify the validity of his article right now. However, there is a validity to his anti-AMD stance. So given what's verifiable and what's not, we can toss out the unverifiable data, and we're left with the verifiable data. That data suggests it's simply a product of his relationship with AMD.

Am I missing something that you're seeing?

>letting the loo witch get her hands on your GPU
I suppose it could get infused with magic, though.

JEE PEE YOO IN LOO

delete this sir

>However, there is a validity to his anti-AMD stance.

what anti-AMD stance? pretty much every review of an AMD product he's done in the last 10 years has been positive, and he has partnered with AMD for marketing events in the past. if anything, it looks more like AMD has had an anti-reviewer stance in more recent years due to their subpar product launches. hardocp was not the only site to get shafted by Roy Taylor.

They were also completely useless for anything other than gaming due to the nature of VLIW: shaders were grouped in clusters of 4 or 5 (depending on the version), in which one core could do mostly anything while the rest could only do one thing. This made them a joke for GPGPU tasks other than serial integer processing, e.g. password cracking.

It's also where the shit drivers meme comes from: the drivers for VLIW cards had to translate DirectX shader calls in a way that best utilized all the VLIW cores (instead of just every fifth, fully capable core).

That was the entire reason why they switched away from VLIW and to GCN, in order to produce stronger cores which are more useful in GPGPU tasks and therefore have a significantly bigger market.
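The packing problem described above is easy to sketch. Below is a toy model of VLIW5-style bundling (purely illustrative; not AMD's actual shader compiler): each bundle has five slots, only one "t" slot per bundle can run special ops, and independent instructions are packed greedily. A graphics-like mix fills the lanes, while a compute-like mix that leans on the fat slot wastes them:

```python
# Toy VLIW5-style bundle packer. Illustrative only, not AMD's real compiler.
# Each bundle has 5 slots. Only one "t" slot per bundle can execute special
# ops (transcendentals etc.); the other 4 slots are simple ALUs.

def pack_bundles(ops):
    """ops: list of (name, needs_t_slot). Greedily pack into 5-wide bundles."""
    bundles = []
    current, t_used = [], False
    for name, needs_t in ops:
        # Start a new bundle if this one is full, or the t slot is taken.
        if len(current) == 5 or (needs_t and t_used):
            bundles.append(current)
            current, t_used = [], False
        current.append(name)
        t_used = t_used or needs_t
    if current:
        bundles.append(current)
    return bundles

# Shader-like mix: mostly simple ops, so the lanes fill up nicely.
graphics = [("mul", False)] * 8 + [("rsq", True)]
# Compute-like mix where every op needs the fat slot: one useful op per bundle.
compute = [("special", True)] * 4

print(len(pack_bundles(graphics)))  # 2 bundles for 9 ops
print(len(pack_bundles(compute)))   # 4 bundles for 4 ops
```

Same hardware, wildly different utilisation depending on the instruction mix, which is roughly why the driver's packing job mattered so much on those cards.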

>mfw my 390x oc runs at 30 degrees idle and 61 degrees full load
>on air
the magic of case fans and a backplate

AMD is already a joke for GPGPU tasks, nobody uses their GPUs for stuff like that despite them being better on paper for most GPGPU work.

>It was also from where the shit drivers meme comes from: the drivers for VLIW cards had to translate the DirectX shader calls in a way so they best utilize all the VLIW cores (instead of just every fifth fully capable core).

all GPU architectures need to translate api calls and shader code to the GPU's ISA, there's nothing specific about that to VLIW architectures.

>That was the entire reason why they switched away from VLIW and to GCN, in order to produce stronger cores which are more useful in GPGPU tasks and therefore have a significantly bigger market.

ironically, it lost them the market entirely. the only GPGPU application AMD ever had a foothold in was bitcoin mining, which was quickly replaced by purpose built FPGAs and ASICs.

do you watch many such events? they're all like that, some even worse

>61c full load
>not housefire

This is also on air amigo.

It would be physically impossible for Polaris to suck. It might run hot, but it's going to have much higher clock speeds than before, and given AMD's superior IPC (for lack of a better word) compared to Maxwell, they'll be fine.

>trying to reason with a nvidiot
>ever

Except that we already know it sucks. It isn't competing with NVIDIA on the performance front at all.

now run furmark for at least 10 minutes and post results

>being this deluted

It's not supposed to, it's probably going to top out at $300 and give a bit more performance than a 390.

AMD has a review blacklist. Nvidia has a review blacklist. Those are true statements. However, that doesn't mean review sites don't have biases of their own. Kyle has an anti-AMD bias. Everyone saw him being a child with the whole Fury/Nano shit. The emotional, unprofessional articles he wrote in response were unnecessary, as was the back-and-forth banter between the AMD PR guy and him.

Also, you can still write professional articles even if you have an anti-AMD bias. This article is pure rumor meant to create FUD just before the launch of AMD's new GPUs and CPUs. If Polaris is indeed trash, we'll know in the coming weeks with proper benchmarks; until then, "Polaris is trash and runs hotter, slower and uses more power than Nvidia" is useless without any benchmark.

Deluded? We've already seen benchmarks. The Polaris 10 cards are slower than even the cut-down last-gen cards like the 980 and the Fury non-X.

>durr why isn't a 220mm2 chip competing with a 320mm2 chip

Maybe because one was made to replace midrange and mobile last-gen chips while the other was meant to replace high-end last-gen chips.
AMD's high end comes with Vega in October.
And Nvidia's midrange comes with GP106 in September.
No one has a full lineup; both are aimed at different markets.

>AMD is already a joke for GPGPU tasks

Yeah, and that came from VLIW and how they changed driver kernels every fucking week. I was following bruteforcers a lot back in the day; the 5k and 6k cards had insane performance in that regard, but the applications had to be recompiled with basically every driver release.

>all GPU architectures need to translate api calls and shader code to the GPU's ISA, there's nothing specific about that to VLIW architectures.

The difference is that it was significantly easier to translate the calls to Nvidia cores than it was for VLIW, due to the asymmetrical nature of those shaders.

Basically, Nvidia shaders were closer to being general-purpose cores, while VLIW shaders were glorified DSP vector-processor clusters set up to do graphics tasks.

This made the drivers way more crucial to things running fine, and more often caused things to break.

>ironically, it lost them the market entirely. the only GPGPU application AMD ever had a foothold in was bitcoin mining, which was quickly replaced by purpose built FPGAs and ASICs.

They only lost that market because cryptocurrency mining became mainstream, which meant people with actual hardware know-how started taking a serious financial interest in it.

But to their credit, they also MADE that entire market popular due to how ridiculously fast it was to mine on Radeon cards.