Rank #5738 on Comments | Level 215 Comments: Comedic Genius
Date Signed Up: 10/08/2011

FunnyJunk Career Stats
Highest Content Rank: #8228
Highest Comment Rank: #4166
Content Thumbs: 11 total (23 up, 12 down)
Comment Thumbs: 1704 total (3013 up, 1309 down)
Content Level Progress: 23.72% (14/59)
Level 0 Content: Untouched account → Level 1 Content: New Here
Comment Level Progress: 52% (52/100)
Level 215 Comments: Comedic Genius → Level 216 Comments: Comedic Genius
Total Comments Made: 1749
latest user's comments
#24 - i never dated anyone in highschool and i honestly feel like i … [+] (1 new reply) | 06/07/2015 on Kim Possible | 0
#68 - amd has always had a price/performance factor over competitors… [+] (1 new reply) | 06/07/2015 on (untitled) | 0
#83 - sniffythebird (06/07/2015) [-]
Not always, and definitely not in every aspect if you know what to look at. They had a competitive edge mostly around 2009-2011. Ever since Nvidia's Kepler architecture came out, their GPUs have been way more efficient, and have usually held the #1 spot for raw performance in top-end cards as well.
Yes, AMD still has a few price advantages here and there, like the 290X having slightly more performance per dollar than a 970, but that is ONE factor out of many when it comes to determining how good a graphics card is overall. The high-end R9 200 series cards are notoriously inefficient, hot, and loud. So if you're someone debating between a GTX 970 and an R9 290X, you can't just look at price and raw performance without considering the TDP.
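The tradeoff described above can be sketched with some rough numbers. The prices, relative-performance scores, and TDP figures below are ballpark launch-era values assumed purely for illustration, not measured benchmarks:

```python
# Illustrative only: compare performance-per-dollar vs performance-per-watt.
# All figures are rough, assumed values, not real benchmark data.
cards = {
    "GTX 970": {"price_usd": 330, "relative_perf": 100, "tdp_w": 145},
    "R9 290X": {"price_usd": 310, "relative_perf": 103, "tdp_w": 290},
}

for name, c in cards.items():
    perf_per_dollar = c["relative_perf"] / c["price_usd"]
    perf_per_watt = c["relative_perf"] / c["tdp_w"]
    print(f"{name}: {perf_per_dollar:.3f} perf/$, {perf_per_watt:.3f} perf/W")
```

With numbers in this ballpark the 290X edges ahead on performance per dollar, while the 970 is roughly twice as good on performance per watt, which is exactly the point being made: price/performance alone hides the efficiency gap.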
"B-but... but muh power bill will only be a few cents moar, it doeznt matter!" I've heard a lot of people say.
It's not just that: they'd need a better power supply, or even if they have enough watts, it'll be under much heavier load, which drastically shortens its lifespan. And that goes for the cards too. Even if AMD claims they're "designed" to run at 95-105C (lol okay), the laws of physics still apply. Electrical circuits deteriorate much faster at those temperatures; it just isn't good, and it doesn't do much for stability either. I've talked to several people who have owned R9 290 / 290X cards. One of them had a 1200W PSU which still didn't cut it for dual 290Xs, and the PSU even blew up. Another guy (a coworker) had an R9 290 that melted the plastic/rubber on the HDMI cable plugged into it.
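The "few cents moar" claim is easy to sanity-check with back-of-envelope arithmetic. The extra wattage, usage hours, and electricity rate below are all assumed values for illustration:

```python
# Rough yearly cost of a higher-TDP card, assuming ~145 W more draw
# under load, 4 hours of gaming per day, and $0.12 per kWh.
# Every number here is an assumption, not a measurement.
extra_watts = 145
hours_per_day = 4
rate_per_kwh = 0.12

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
extra_cost_per_year = extra_kwh_per_year * rate_per_kwh
print(f"~{extra_kwh_per_year:.0f} kWh/yr, ~${extra_cost_per_year:.2f}/yr")
```

Under these assumptions the difference is on the order of $25 a year rather than "a few cents," though as the comment argues, the bigger costs are PSU headroom, heat, and component lifespan, not the bill itself.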
The gist of it is, in the past 3 years or so, Nvidia has focused more on power/thermal efficiency, while AMD has stuck with old technology for too long and pushed it to its breaking point. That's why some of their cards have a bang-for-buck edge, but at a huge cost in efficiency, lifespan, temperature/noise, and reliability. In my opinion, those are simply not worth the little money saved when buying a current AMD card, especially not in the long run.
I haven't seen any 300 series benchmarks, and as far as I know (I could be wrong) there are no benchmarks out yet, so there's really nothing to go on until they release, or until some proper reviewers get their hands on the cards and do comparisons. Out of pure speculation, I highly doubt a 390X would beat a Titan X or 980 Ti in performance when its price point is half as much; it'd land pretty much equal to a 980. We'll just have to see. Hey, it could be better; we don't know yet. Same goes for their HBM (High Bandwidth Memory): even if it's predicted to be superior, it isn't proven to work yet. And I think you're confusing DDR with GDDR. Those are entirely different things.
#66 - Avid AMD fanboy here, and you basically hit the nail on the head. [+] (12 new replies) | 06/07/2015 on (untitled) | 0
#91 - kibuza (06/07/2015) [-]
You mean they made it using Nvidia's source code, and Nvidia forbids people from sharing their intellectual and physical property with a competing company? Good god, those assholes.
They should be more like every other business. Every other business definitely shares equal coding / resources / property. I mean look at Samsung and Apple, they definitely don't try to one-up the competition by making superior products and not sharing specs and codes.
#111 - ilcecchino (06/08/2015) [-]
Except for the blackmailing, threatening, and bribing, among other things. That's not competitive; that's not "may the best product win." That's the underhanded and dirty shit they've been doing for years to take the global market from AMD. Intel has done the same, and there has even been a lawsuit over it. Before Intel took over the global market for the reasons mentioned, AMD was far and above everyone else.
#112 - kibuza (06/08/2015) [-]
Got any evidence or are you just shitting out of your mouth?
Also, the fact that there was a lawsuit means nothing. What was the verdict on this lawsuit? You're just depressed that a company worth about 3B and spending less and less on R&D each year is somehow losing to a company worth 12B+ which is spending more on R&D each year.
In summary, you are a fucking idiot. Feel free to continue buying your inferior products.
#113 - ilcecchino (06/08/2015) [-]
Losing? lol, no. I'm not depressed, I'm impressed. AMD is keeping up despite that handicap; that shows a company that knows what the fuck they're doing.
Just one instance against Intel; there are more cases that never made it to court.
I only skimmed this article, but it looks like the one I read a while ago, saying how PhysX is basically just a scam to make people want to buy Nvidia products, and that CPU-based physics are much more efficient and better.
That's alongside the many other unnecessary things Nvidia puts into their cards that could be done in a universal way, just so they can bribe companies into optimizing for their cards and only seem superior, when in fact they are not. At least that was the case before they got such a huge lead that they can just shit money out until something works.
#114 - kibuza (06/08/2015) [-]
>Company value steadily decreasing year after year
>Competitors value steadily increasing year after year
Nigga you understand how businesses work right?
AMD vs Intel is not AMD vs Nvidia. Nice job pulling that gem out of your ass. Also, accusations which never make it to court are fucking worthless. "ILCECCHINO rapes dolphins". I guess it must be true because it's been said but never proven in court.
Also, if PhysX and GameWorks can be run so much better on a CPU, why would developers pay for a license to use them and limit their customer base?
Use your fucking head.
#115 - ilcecchino (06/08/2015) [-]
We literally just said this: Nvidia will pay companies to optimize games for their cards, and that includes PhysX and GameWorks. And congrats? I know AMD vs Intel is not AMD vs Nvidia; I'm stating points and facts about AMD. AMD lost the global market to Intel back when they were just making processors, around '05. ATI was losing out to Nvidia and got picked up by AMD. They create cards and chips, but because they had already lost the global market, Intel and Nvidia already had their names out.

I've asked a bunch of my friends who are iffy on computer products, since it's not their hobby. ALL of them knew Intel's name; none even knew AMD existed. They were iffy on graphics cards, and Nvidia only came up because some of their games show the logo on startup. It's like going to the market looking for soup: you know Campbell's chicken noodle soup (Intel/Nvidia), you've seen a commercial, so you buy it, even though there are other brands (AMD/ATI) you've never heard of that could have superior soup.

That's basically what happened to AMD for years and years, to the point where we are now: sure, they're making slightly better products, but with as much of a lead as Intel and Nvidia have in terms of money, AMD shouldn't even be able to hold a candle to them. Yet they can, and do, proudly. AMD has always been about the product, the games, the tech. Nvidia is a company that only cares about milking their consumer base for as much money as they can, much like Apple. That doesn't mean they make bad stuff, but their goal is the money, not making the best product they can. If AMD had Nvidia's budget, the gaming world would be in a much, much better place. For that reason I'll support AMD till the day they declare bankruptcy, knock on wood. I've been with AMD since I learned to build a computer when I was 10; I've never had any problems and only have great things to say about their products.
#116 - kibuza (06/08/2015) [-]
Hit the enter key once in a while. Look at that block of bullshit.
Man, your points are so easy to disprove; the conversation was about Nvidia vs AMD until you brought up Intel out of nowhere.
Developers buy the license from Nvidia to use Gameworks and other features. Still waiting on some proof that Nvidia pays companies to use their tech, or did you conveniently forget to post proof?
>Nvidia not caring about technology and just wanting money
>Nvidia spending more on R&D developing NEW technology
>AMD using the same shit for the last 5 years.
Yea, it's Nvidia milking it for money.
Also, your last sentence pretty much makes your opinion on AMD vs Nvidia worthless. You've only used AMD; you have no reference data.
I currently have a 6GB dual FTW Edition 7990. Not only does Xfire fail to work with nearly every game in existence, it barely runs better than my old GTX 680.
The next card I get will be an Nvidia. Maybe AMD will turn their shit around, but as it stands now they are inferior to Nvidia cards. It's just a fact.
#117 - ilcecchino (06/08/2015) [-]
I said I've been with AMD, not that I've only used them. How can you side with Nvidia if you've never used an AMD chip/card? Oh, you have? Look at that: you've used both sides and you're still with Nvidia.
I have a 7970 and an FX-8350 Black Edition. Only this year have I started running into GPU bottlenecking, and I will be getting an R9 390X as soon as it releases. My 8350 is running perfectly fine. I have friends who are Nvidia fans, and even they say that SLI causes more problems than it mitigates. It's just not something developers really put time behind, because there aren't too many people with dual-GPU rigs, nor is it really needed.
and what "new" technologies. PhysX when we've had havok for the longest time? Gameworks? which as we see with UE4 can all be done inside the engines. Nvidia hairworks. which is basically just TressFx
which is not only faster than hairwoks. AMD makes it available to everyone (seeing as it runs faster, atleast in this example on nvidia cards than even hairworks)
#118 - kibuza (06/08/2015) [-]
Bro, you obviously don't fact-check your shit before it dribbles out of your mouth. GameWorks was in use before UE4 was even released, you idiot.
Yeah, Havok is definitely as good as PhysX. That's why we see it less and less while more companies opt for PhysX.
SLI is usually more hassle than it's worth, you're right. But AMD Xfire is fucking useless; in 80% of games I have to go into the AMD control panel and disable Xfire.
My friend (since apparently we can bring our friends in now) runs SLI 680s, and they are amazing. Even when they don't work to full capacity, at least he doesn't need to go into a separate control panel and manually disable them.
#119 - ilcecchino (06/08/2015) [-]
No shit GameWorks was in use before UE4; it was just an example. Everything GameWorks did could be done in-engine. It's nothing new.
Havok is just as good. I've never played a game where I went "oh my god, these physics" while I had my Nvidia card. The only game where I've even seen PhysX be remotely visible is Borderlands, and even then it's just the grenades, flags, and banners that I noticed a difference in.
#84 - sniffythebird (06/07/2015) [-]
Really, it's not that games are optimized so performance is biased. That's what a lot of people think, and from that I can safely say they haven't used a high-end Nvidia card with games like that cranked all the way up.
Proprietary stuff like GameWorks just butchers performance when used, with little more than a few gimmicky graphical effects. Stuff like HBAO+ and TXAA looks like crap and tanks framerates to a ridiculous level; it's just not worth using. Far Cry 4, for example, was praised to the skies by Nvidia, yet it runs like shit on dual 980s.
But it goes both ways. Thief and Battlefield 4 were supposed to have vastly superior performance with Mantle, and they partnered with AMD, yet Mantle is actually worse than DX11 in most cases.
#111 - i meant "current" as relative. as in "current to… [+] (1 new reply) | 06/07/2015 on trigger warning | +5
#101 - you asked what man. i told you. you not having a girlfriend do… [+] (3 new replies) | 06/07/2015 on trigger warning | +6
#98 - a lot. i for one wouldn't. i only like having sex with my curren… [+] (7 new replies) | 06/07/2015 on trigger warning | +4
#31 - not that little dangly thing get rekt frank | 06/06/2015 on The little dangly thing | +7
#107 - dude got rekt at the end | 06/06/2015 on BRAKE | 0
#184 - YOUR DREAMS, JUST DO IT, YOUR DREAMS, JUST DO IT, DO IT, YOUR … | 06/06/2015 on A new Shia every 3 seconds | +6
#99 - what's she look like now? she has a lot of potential if she di… [+] (2 new replies) | 06/02/2015 on Bitch vs Meninist | 0
#124 - imashitbricks (06/02/2015) [-]
twitter.com/torialyp ) but it's no longer hers, or she deleted everything on it. Sorry, but this is as much digging as I feel like doing currently. If I feel like it when I get back on later, I'll try to find her new stuff.