
ilcecchino

Last status update: -
Date Signed Up: 10/08/2011
Last Login: 7/27/2016
FunnyJunk Career Stats
Comment Ranking: #5738
Highest Content Rank: #8228
Highest Comment Rank: #4166
Content Thumbs: 11 total (23 up, 12 down)
Comment Thumbs: 1704 total (3013 up, 1309 down)
Content Level Progress: 23.72% (14/59)
Level 0 Content: Untouched account → Level 1 Content: New Here
Comment Level Progress: 52% (52/100)
Level 215 Comments: Comedic Genius → Level 216 Comments: Comedic Genius
Subscribers: 1
Content Views: 3808
Total Comments Made: 1749
FJ Points: 1603

latest user's comments

#24 - i never dated anyone in high school and i honestly feel like i … (1 new reply) 06/07/2015 on Kim Possible 0
#25 - advice (06/08/2015)
Depending on how big your dick is, it could've worked, man. Ugly but good dick usually works in high school.
#68 - amd has always had a price/performance factor over competitors… (1 new reply) 06/07/2015 on (untitled) 0
#83 - sniffythebird (06/07/2015)
Not always, and definitely not in every aspect if you know what to look at. They had a competitive edge mostly around 2009-2011. Ever since Nvidia's Kepler architecture came out, their GPUs have been way more efficient, and they have usually held the #1 spot for raw performance in top-end cards as well.

Yes, AMD still has a few price advantages here and there, like the 290X having slightly more performance per dollar than a 970, but that is ONE factor out of many when it comes to determining how good a graphics card is OVERALL. The high-end R9 200 series cards are notoriously inefficient, hot, and loud. So if you're someone who is debating between a GTX 970 and an R9 290X, you can't just look at price and raw performance without considering the TDP.

"B-but... but muh power bill will only be a few cents moar, it doeznt matter!" I've heard a lot of people say.
Bullshit.
It's not just that: they'd need a better power supply, or even if they have enough watts, it'll be under much heavier load, which drastically shortens its lifespan. And that goes for the cards too. Even if AMD claims they're "designed" to run at 95-105C (lol okay), the laws of physics still apply. Electrical circuits deteriorate much faster at those temperatures; it just isn't good. And it doesn't do much good for stability either. I've talked to several people who have owned R9 290 / 290X cards: one of them had a 1200W PSU which still didn't cut it for dual 290Xs, and the PSU even blew up. Another guy (a coworker) had an R9 290 that melted the plastic/rubber on the HDMI cable that was plugged into it.
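(As a rough back-of-the-envelope sketch of the performance-per-dollar and power-bill points above, here is a minimal Python snippet. Every price, TDP, usage figure, and electricity rate in it is a placeholder assumption for illustration, not a number taken from this thread.)

# Back-of-the-envelope comparison: sticker price vs. electricity over time.
# All figures below are assumed placeholders; substitute your own numbers.

CARDS = {
    "GTX 970": {"price_usd": 330, "tdp_watts": 145},   # assumed price and TDP
    "R9 290X": {"price_usd": 310, "tdp_watts": 290},   # assumed price and TDP
}

HOURS_PER_DAY = 3.0        # assumed gaming hours per day
USD_PER_KWH = 0.12         # assumed electricity rate
YEARS_OWNED = 3            # assumed ownership period

for name, card in CARDS.items():
    # Energy drawn per year at full TDP for the assumed daily hours
    kwh_per_year = card["tdp_watts"] / 1000 * HOURS_PER_DAY * 365
    power_cost = kwh_per_year * USD_PER_KWH * YEARS_OWNED
    total_cost = card["price_usd"] + power_cost
    print(f"{name}: ${card['price_usd']} sticker + ~${power_cost:.0f} "
          f"electricity over {YEARS_OWNED} years = ~${total_cost:.0f}")

(Whether the gap at the wall outweighs the sticker-price gap depends entirely on those assumed inputs, which is the point being argued here.)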

The gist of it is, in the past 3 years or so Nvidia has focused more on increasing power/thermal efficiency, while AMD has stuck with old technology for too long and pushed it to its breaking point. That's why some of their cards have a bang-for-buck edge, but at a huge cost in efficiency, lifespan, temperature/noise, and reliability. In my opinion, those are simply not worth the little money saved when buying a current AMD card, especially not in the long run.

I haven't seen any 300 series benchmarks, and as far as I know (could be wrong) there are no benchmarks yet either, so there's really nothing to go on until they release or some proper reviewers get their hands on some and do comparison benchmarks. Out of pure speculation, I highly doubt a 390X would beat a Titan X or 980 Ti in performance if its price point is half as much, pretty much equal to a 980's. We'll just have to see. Hey, it could be better. We don't know yet. Same goes for their HBM (High Bandwidth Memory): even if it's predicted to be superior, that doesn't mean it IN FACT works yet. And I think you're confusing DDR with GDDR. Those are entirely different things.

#66 - Avid AMD fanboy here, and you basically hit the nail on the head. (12 new replies) 06/07/2015 on (untitled) 0
#91 - kibuza (06/07/2015)
You mean they made it using Nvidia's source code, and Nvidia forbids people from sharing their intellectual and physical property with a competing company? Good god, those assholes.

They should be more like every other business. Every other business definitely shares equal coding / resources / property. I mean look at Samsung and Apple, they definitely don't try to one-up the competition by making superior products and not sharing specs and codes.
#111 - ilcecchino (06/08/2015)
Except blackmailing, threatening, and bribing, among other things. That's not competitive. That's not "may the best product win"; that's the underhanded and dirty shit they have been doing for years to take the global market from AMD. Intel has done the same, and there has even been a lawsuit over it. Before Intel took over the global market, for the reasons mentioned, AMD was far and above everyone else.
#112 - kibuza (06/08/2015)
Got any evidence or are you just shitting out of your mouth?

Also, the fact that there was a lawsuit means nothing. What was the verdict on this lawsuit? You're just depressed that a company worth about 3B and spending less and less on R&D each year is somehow losing to a company worth 12B+ which is spending more on R&D each year.

In summary, you are a fucking idiot. Feel free to continue buying your inferior products.
#113 - ilcecchino (06/08/2015)
Losing? lol, no. I'm not depressed, I'm impressed. AMD is keeping up despite that handicap. That shows a company that knows what the fuck they are doing.

en.wikipedia.org/wiki/Advanced_Micro_Devices,_Inc._v._Intel_Corp.

Just one instance against Intel. There are more cases that never made it to court.

www.bit-tech.net/news/hardware/2008/12/11/amd-exec-says-physx-will-die/1
I only skimmed this article, but it looks like the one I read a while ago, saying how PhysX is basically just a scam to make people want to buy Nvidia products, and that CPU-based physics are much more efficient and better.

Alongside the many other unnecessary things Nvidia puts into their cards that could be done in a universal way, just so that they can bribe companies into optimizing for their cards so they only seem superior, when in fact they are not. At least that was the case before they got such a huge lead that they can just shit money out until something works.
#114 - kibuza (06/08/2015)
>Company value steadily decreasing year after year
>Competitors value steadily increasing year after year
>Not losing

Nigga you understand how businesses work right?

AMD vs Intel is not AMD vs Nvidia. Nice job pulling that gem out of your ass. Also, accusations which never make it to court are fucking worthless. "ILCECCHINO rapes dolphins". I guess it must be true because it's been said but never proven in court.

Also, if PhysX and GameWorks can be run so much better on a CPU, why would developers pay for a license to use them and limit their customer base?

Use your fucking head.
#115 - ilcecchino (06/08/2015)
We literally just said this: Nvidia will literally pay companies to optimize games for their cards, and that includes PhysX and GameWorks. And congrats? I know AMD vs Intel is not AMD vs Nvidia. I'm stating points and facts about AMD. AMD lost the global market to Intel back when they were just making processors, around '05. ATI was losing out to Nvidia and got picked up by AMD; they made cards and chips, but because they had lost the global market already, Intel and Nvidia already had their names out.

I've asked a shitload of my friends who are kind of iffy on computer products, because it's not their hobby. ALL of them knew Intel's name; none even knew AMD existed. They were iffy on graphics cards; Nvidia only came up because some of their games have the logo at the start. It's like going to the market and looking for soup. You know Campbell's chicken noodle soup (Intel/Nvidia): you've seen a commercial, you buy it, even though there are other brands (AMD/ATI) you've never heard of that could possibly have superior soup.

That's basically what happened to AMD for years and years, to the point where we are now, where, sure, they're making slightly better products. But with as much of a lead as Nvidia has in terms of money, AMD shouldn't even be able to hold a candle to them, yet they can, and do, proudly. AMD is a company that has always been about the product, the games, the tech. Nvidia is a company that only cares about milking their consumer base for as much money as they can, much like Apple. We all know Apple does; that doesn't mean they make bad stuff, but their goal is the money, not to make the best product they can.

If AMD had the budget that Nvidia has, the gaming world would be in a much, much better place, and for that reason I will support AMD till the day they claim bankruptcy, knock on wood. I've been with AMD since I learned to build a computer when I was 10. I've never had any problems and only have great things to say about their products.
#116 - kibuza (06/08/2015)
Hit the enter key once in a while. Look at that block of bullshit.

Man, your points are so easy to disprove; the conversation was about Nvidia vs AMD until you brought up Intel out of nowhere.

Developers buy the license from Nvidia to use Gameworks and other features. Still waiting on some proof that Nvidia pays companies to use their tech, or did you conveniently forget to post proof?

>Nvidia not caring about technology and just wanting money
>Nvidia spending more on R&D developing NEW technology
>AMD using the same shit for the last 5 years.

Yea, it's Nvidia milking it for money.

Also, your last sentence pretty much makes your opinion on AMD vs Nvidia worthless. You've only used AMD. You have no reference data.

I currently have a 6GB dual FTW Edition 7990. And not only does Xfire not work with nearly every game in existence, it barely runs better than my old GTX 680.

The next card I get will be an Nvidia. Maybe AMD will turn their shit around, but as it stands now they are inferior to Nvidia cards. It's just a fact.
#117 - ilcecchino (06/08/2015)
I said I've been with AMD, not that I've specifically only used them. How can you side with Nvidia if you've never used an AMD chip/card? Oh, you have? Look at that. You've used both sides and you're still with Nvidia.

I have a 7970 and an FX-8350 Black Edition. Only this year have I started running into GPU bottlenecking, and I will be getting an R9 390X as soon as it releases. My 8350 is running perfectly fine. I have friends who are Nvidia fans, and even they say that SLI causes more problems than it solves. It's just not something developers really put time behind, because there aren't too many people who have dual-GPU rigs, nor is it really needed.

And what "new" technologies? PhysX, when we've had Havok for the longest time? GameWorks, which as we see with UE4 can all be done inside the engines? Nvidia HairWorks, which is basically just TressFX?

wccftech.com/tressfx-hair-20-detailed-improved-visuals-performance-multiplatform-support/

Which is not only faster than HairWorks, but AMD makes it available to everyone (seeing as it runs faster than even HairWorks, at least in this example, on Nvidia cards).
#118 - kibuza (06/08/2015)
Bro, you obviously don't fact-check your shit before it dribbles out of your mouth. GameWorks was in use before UE4 was even released, you idiot.

Yea, Havok is definitely as good as PhysX. That's why we see it less and less and more companies opt for PhysX.

SLI is usually more hassle than it's worth, you're right. But AMD Xfire is fucking useless. In 80% of games I have to go into the AMD control panel and disable Xfire.

My friend (since apparently we can bring our friends in now) runs SLI 680s, and they are amazing. Even when they don't work to full capacity, at least he doesn't need to go into a separate control panel and manually disable them.
#119 - ilcecchino (06/08/2015)
No shit GameWorks was in use before UE4; it was all just an example. Everything GameWorks did could be done in-engine. It's nothing new.

Havok is just as good. I've never played a game where I was like "oh my god, these physics" while I had my Nvidia card. The only game where I've even seen PhysX be remotely visible is Borderlands, and even then it's just the grenades, flags, and banners where I noticed a difference.
#120 - kibuza (06/08/2015)
>Everything could have been done in engine.
>Engine which uses same effects created in late 2014

Yup, it's old tech.

Your argument's pretty weak, bro.

Also, you just admitted that PhysX was better, even if only in a small way.
#84 - sniffythebird (06/07/2015)
Really, it's not optimization that biases game performance. This is what a lot of people think, and from that I can safely say they haven't used a high-end Nvidia card with games like that cranked all the way up.

Proprietary shit like GameWorks just butchers performance when used, with little more than a few gimmicky graphical effects to show for it. Stuff like HBAO+ and TXAA looks like crap and tanks framerates to a ridiculous level; it's just not worth using. Far Cry 4, for example, was praised to the skies by Nvidia, yet it runs like shit on dual 980s.

But it goes both ways. Thief and Battlefield 4 were supposed to have vastly superior performance with Mantle, and they partnered with AMD, except it's actually worse than DX11 in most cases.
#111 - i meant "current" as relative. as in "current to… (1 new reply) 06/07/2015 on trigger warning +5
#112 - scowler (06/07/2015)
Oooooohhhhh.

Sorry.
#101 - you asked what, man. i told you. you not having a girlfriend do… (3 new replies) 06/07/2015 on trigger warning +6
#102 - scowler (06/07/2015)
You said you did, now you don't.

Oh, and that was a typo.
#111 - ilcecchino (06/07/2015)
I meant "current" as relative, as in "current to the time," not "current" meaning the one I have currently. I don't have one currently, meaning I don't want to have sex with anyone atm.
#112 - scowler (06/07/2015)
Oooooohhhhh.

Sorry.
#98 - a lot. i for one wouldn't. i only like having sex with my curren… (7 new replies) 06/07/2015 on trigger warning +4
#99 - scowler (06/07/2015)
Well, I don't have a girlfriend so I wouldn't mind, unless she looks like she might have VD...
#198 - whataphreek (06/07/2015)
That explains a lot
#207 - scowler (06/07/2015)
It's not like engineering or anything...
#101 - ilcecchino (06/07/2015)
You asked what, man. I told you. You not having a girlfriend doesn't speak for all of us.

And I don't have one atm either. Still wouldn't want to.
#102 - scowler (06/07/2015)
You said you did, now you don't.

Oh, and that was a typo.
#111 - ilcecchino (06/07/2015)
I meant "current" as relative, as in "current to the time," not "current" meaning the one I have currently. I don't have one currently, meaning I don't want to have sex with anyone atm.
#112 - scowler (06/07/2015)
Oooooohhhhh.

Sorry.
#31 - not that little dangly thing, get rekt frank 06/06/2015 on The little dangly thing +7
#107 - dude got rekt at the end 06/06/2015 on BRAKE 0
#184 - YOUR DREAMS, JUST DO IT, YOUR DREAMS, JUST DO IT, DO IT, YOUR … 06/06/2015 on A new Shia every 3 seconds +6
#99 - what's she look like now? she has a lot of potential if she di… (2 new replies) 06/02/2015 on Bitch vs Meninist 0
#124 - imashitbricks (06/02/2015)
It seems her Twitter is now under new management. I found the tweet in question and it showed her Twitter handle to be @torialyp ( twitter.com/torialyp ), but it's no longer hers or she deleted everything on it. Sorry, but this is as much digging as I feel like doing currently. If I feel like it when I get back on later, I'll try to find her new stuff.
#228 - buffygifs (06/03/2015)
She deleted her account as a direct result of this.