AMD buys ATI



tibilicus
07-24-2006, 14:09
http://www.pcw.co.uk/personal-computer-world/news/2160898/amd-buys-ati-4b-deal


Well, what do you think?

Mikeus Caesar
07-24-2006, 14:16
Bwahahah, the best of both worlds.

AMD and ATI both roxxorz my soxxorz, so this can only be good.

DukeofSerbia
07-24-2006, 17:12
I double posted.

DukeofSerbia
07-24-2006, 17:12
We shall see what the future will bring. I like both AMD and ATI (I have an AMD processor and an ATI graphics card).

Official ATI home page: http://www.ati.com/companyinfo/about/amd-ati.html

orangat
07-24-2006, 19:30
It's not the most fortunate time for a merger, with AMD having issues. Hopefully ATI hits the ground running and churns out some good stuff.
Intel's partnership with VIA does make more sense in light of the merger.

x-dANGEr
07-24-2006, 21:32
Surprising.

Blodrast
07-24-2006, 23:22
Personally, I'm not overly enthusiastic about such mergers. The fewer giants driving us toward monopolies, the better. At this pace, in 10 years we'll only have MicroIntel and IBSun or so. Not so great for the consumer, ya' know.

But I guess in the short term, this might be good for the consumer.

Big_John
07-24-2006, 23:38
i don't know if this will make much short-term difference besides intel (probably) dropping crossfire support from their chipsets. the analysis i've read seems to indicate that this merger is all about the future (5-10 year outlook) of both companies, with respect to the coming obsolescence of gpus and the coming emergence of mini-core processing.

orangat
07-25-2006, 02:34
i don't know if this will make much short-term difference besides intel (probably) dropping crossfire support from their chipsets. the analysis i've read seems to indicate that this merger is all about the future (5-10 year outlook) of both companies, with respect to the coming obsolescence of gpus and the coming emergence of mini-core processing.

Obsolescence of gpu's? How/why is that? If anything, gpu's are even more important today. And do you mean multi-core or mini-core?

Big_John
07-25-2006, 02:56
Obsolescence of gpu's? How/why is that? If anything, gpu's are even more important today. And do you mean multi-core or mini-core?

here's the kind of analysis i'm reading:
http://www.theinquirer.net/default.aspx?article=33219

edit: some snippets

Let's look at this long term, say five or so years, the design cycle of a modern CPU. As we've noted earlier, the X86 CPU is about to take a radical turn, and the designs you will see at the turn of the decade won't resemble anything you see now. What do we mean by that? Mini-cores and Larrabee.

[...]

Kevet and Keifer were a mini-core and a CPU made of 32 of those cores respectively aimed at server workloads. It was four times what Niagara was reaching for, but also five years later. Intel is going for the swarm of CPUs on a slab approach to high performance CPUs, and more importantly, is going to upgrade the chips on a much swifter cycle than we've been used to.

With 32 small and simple cores, you can design each core much more quickly than a normal CPU. Design complexity, verification and other headaches make things an almost geometrically increasing design problem. A small core cut and pasted 32 times can mean smaller teams doing more real work instead of busy work, and more teams tweaking things for niches.

[...]

Now, if you add in GPU functionality to the cores, not a GPU on the die, but integrated into the x86 pipeline, you have something that can, on a command, eat a GPU for lunch. A very smart game developer told me that with one quarter of the raw power, a CPU can do the same real work as a GPU due to a variety of effects, memory scatter-gather being near the top of that list. The take-home message is that a GPU is the king of graphics in today's world, but with the hard left turn Sun and Intel are taking, it will be the third nipple of the chip industry in no time.

Basically, GPUs are a dead end, and Intel is going to ram that home very soon. AMD knows this, ATI knows this, and most likely Nvidia knows this. AMD has to compete, if it doesn't, Intel will leave it in the dust, and the company will die. AMD can develop the talent internally to make that GPU functionality, hunt down all the patents, licensing, and all the minutia, and still start out a year behind Intel. That is if all goes perfectly, and the projects are started tomorrow.

The other option is to buy a team of engineers that produce world-class products, are battle tested, and have a track record of producing product on the same yearly beat Intel is aiming for. There are two of these in existence, ATI and Nvidia. Nvidia is too expensive, and has a culture that would mix with AMD like sand and Vaseline. That leaves ATI, undervalued and just as good.
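
The article's "geometrically increasing design problem" claim can be sketched as a toy back-of-the-envelope model. The quadratic cost exponent and the effort units below are illustrative assumptions, not real engineering data:

```python
# Toy model: if design/verification effort grows roughly quadratically
# with the complexity of a single core, one big core is far more
# expensive to build than a small core stamped out many times.

def design_effort(core_complexity, exponent=2.0):
    """Effort to design and verify ONE core of the given complexity."""
    return core_complexity ** exponent

# One monolithic core carrying 32 "units" of complexity...
big_core = design_effort(32)        # 32^2 = 1024 effort units

# ...versus one simple core (complexity 1), designed and verified once,
# then cut and pasted 32 times across the die at near-zero extra cost.
mini_core_chip = design_effort(1)   # 1 effort unit

print(big_core / mini_core_chip)
```

Under this (hedged) quadratic assumption, the cut-and-paste approach is three orders of magnitude cheaper to verify, which is the intuition behind the "smaller teams doing more real work" line.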

Alexander the Pretty Good
07-25-2006, 03:08
Interesting. No support between Intel and ATI = :bigcry: and more compatibility crap to worry about.

I wonder what that article is referring to with the "culture" of nVidia. Personally, I like ATI, but what's the big difference in "culture"?

Big_John
07-25-2006, 03:14
Interesting. No support between Intel and ATI = :bigcry: and more compatibility crap to worry about.

I wonder what that article is referring to with the "culture" of nVidia. Personally, I like ATI, but what's the big difference in "culture"?

yeah, i don't know any of that 'tech insider' type stuff. he did say something about nvidia "playing power games" with their sli licences, and intel preparing to "kick nvidia in the teeth" over it.. lol?

Lemur
07-25-2006, 05:12
What's strange is that for the last year or two, the platform of choice has been AMD on an Nvidia Nforce motherboard. Heck, that's what I'm typing on now, and this rig is over three years old (hearken back to the days of Nforce 2, children). AMD and ATI haven't been obvious partners from a gamer's perspective.

And even though Core 2 Duo is a fantastic proc, let's not get too worked up about "AMD's troubles." It's a great company. Just because they're behind right now doesn't mean they'll be in second place a year from now. I have faith that the back-and-forth between Intel and AMD will continue. It's just that AMD fans got complacent, what with Intel having relatively slow and hot desktop procs for the last two years.

The only constant is change.

Papewaio
07-25-2006, 05:58
Just like with professional sports teams, the most consistent thing is the brand name and colours.

I would say buy your new rig on bang for buck. Do not buy based on past performance. Buy based on what you need and the best value for money for it.

So even if AMD owns ATI, unless they bundle some nice combos that hit the quality-for-price pressure points, it really won't matter to the end consumer. If they start making exclusive combos, they might freeze themselves out of the market. There is always an exception to the rule, but since Apple has already cornered that side of the market... I hope for our sakes and AMD's that it stays as compatible as possible with all combinations out there.

Old timers, do you remember when a certain chipset was limited to a certain expensive RAM... that went down like a lead balloon. :oops: :laugh4:

hoom
07-25-2006, 09:56
I see it as a good thing; this is about integrating high-performance ATI tech with the HyperTransport and other infrastructure that AMD has been working on.
Far from the CPU absorbing the GPU, it's more like the GPU gobbling up the CPU.

CPUs are heading parallel.
GPUs are bigger, already highly parallel, do much more per clock & are becoming increasingly general in processing capability.
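
hoom's "do much more per clock" point comes down to simple arithmetic: many slow parallel pipelines can out-produce a few fast serial ones. The unit counts and clock speeds below are illustrative round numbers for 2006-era parts, not the exact specs of any real chip:

```python
# Rough peak-throughput comparison (illustrative numbers only).
# A 2006-era GPU ran dozens of pixel pipelines at a modest clock,
# while a dual-core CPU ran a few SIMD lanes at a much higher clock.

def ops_per_second(units, ops_per_unit_per_clock, clock_hz):
    """Peak operations/second for a chip with parallel execution units."""
    return units * ops_per_unit_per_clock * clock_hz

gpu = ops_per_second(units=48, ops_per_unit_per_clock=2, clock_hz=600e6)
cpu = ops_per_second(units=2,  ops_per_unit_per_clock=4, clock_hz=2600e6)

# The GPU comes out ahead despite running at less than a quarter
# of the CPU's clock, because it does far more work per tick.
print(gpu > cpu)
```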

Alexander the Pretty Good
07-26-2006, 01:22
Old timers do you remember when a certain chip set was limited to a certain expensive RAM... that went down like a lead balloon.
Pre-Intel Macs? :book:

Lemur
07-26-2006, 04:39
ATPG, good guess, but I think he's referring to the Intel/Rambus fiasco, and the ill-begotten i820 motherboard. Made for a bad year 2000 for Intel. I'd say the chipset went over like a lead balloon. Filled with flesh-eating acid. Wrapped in donkey entrails.

Details. (http://zquake.frag.ru/vansmiths/intel.htm)

Catching the Rambus to Nowhere

The planning pipeline at Intel is deep. Several years ago Intel executives decided that the "Camino" chipset (think of the CPU as the brain of a computer, with the chipset serving as the computer's heart) would use Rambus's RDRAM (Rambus Direct Random Access Memory), a new type of RAM first adopted by Nintendo for use in the Nintendo64 (but tellingly dropped from Nintendo's forthcoming "Dolphin"). RDRAM's chief advantage is high bandwidth. What this means is that RDRAM can transport a large amount of data quickly. Unfortunately, this performance advantage is largely offset by high latencies -- it takes a long time for the data to start flowing after a request is made. Additionally, the slow bus speeds used on Intel's PIII processor serve as a bottleneck constricting bandwidth and further degrading performance. The net effect of these combined factors is slightly worse times on most benchmarks compared with similarly clocked SDRAM systems (in fairness, RDRAM does seem to help in benchmarks involving large streams of data such as photo processing).

As the Camino, now loquaciously named "i820," neared completion, other problems cropped up. RDRAM was still (and remains) outrageously expensive -- three to five times more expensive than SDRAM. More problematic, Intel discovered that i820 motherboards equipped with three RDRAM modules were unstable. Feeling heat from AMD, Intel scrambled to fix this bug, but could not and was forced to cancel the i820's launch at the last minute. As embarrassing as this was for Intel, the chip giant furthered its shame. The i820 motherboard bug turned out to be the result of bad engineering and was physically impossible to fix; however, instead of scrapping and redesigning the motherboard, Intel capped the third RDRAM slot so that it could not be used. This is like a car manufacturer releasing a car with three wheels after discovering that the fourth wheel kept falling off.

Even before the i820's launch was initially called off, Micron, the finest maker of RAM products in the world, had already announced that it would forgo RDRAM and instead implement systems around improved, faster and cheaper SDRAM. Since then, companies initially friendly to RDRAM, such as Hyundai, are starting to talk about this memory technology as if it is already dead.

If anything, the news for RDRAM has gotten even gloomier lately. Citing information obtained from a Rambus distributor, a recent Register article states that yields of RDRAM have dropped to below ten per cent. Such low yields force RDRAM prices into the stratosphere and these high prices are causing many motherboard manufacturers to abandon plans to support this failing technology.

From all current appearances, Rambus's RDRAM is on its deathbed. Intel invested very heavily in both the Rambus company and the copious Rambus technology in Intel's roadmaps. Rambus's failure will cost Intel a lot of money, but it can absorb that loss. What is more damaging is that it is costing Intel time and reputation to revamp its roadmaps.
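
The bandwidth-versus-latency tradeoff the article describes is easy to sketch numerically: total access time is fixed latency plus size divided by bandwidth, so latency dominates small requests no matter how fat the pipe. The figures below are illustrative, not measured RDRAM or SDRAM timings:

```python
# Total time to satisfy a memory request of a given size:
#   time = fixed latency + bytes transferred / bandwidth
# High bandwidth only pays off once the transfer is large enough to
# amortize the latency -- which is why RDRAM shone on big streaming
# workloads (photo processing) but lost on latency-sensitive ones.

def access_time_ns(bytes_requested, latency_ns, bandwidth_gb_s):
    """Time in nanoseconds to fetch bytes_requested from memory."""
    transfer_ns = bytes_requested / bandwidth_gb_s  # 1 GB/s = 1 byte/ns
    return latency_ns + transfer_ns

# Hypothetical parts: high-bandwidth/high-latency memory (RDRAM-like)
# vs. lower-bandwidth/lower-latency memory (SDRAM-like).
# A 64-byte cache-line fetch is dominated by latency...
print(access_time_ns(64, latency_ns=90, bandwidth_gb_s=1.6) >
      access_time_ns(64, latency_ns=45, bandwidth_gb_s=1.0))
# ...while a 1 MB stream is dominated by bandwidth.
print(access_time_ns(2**20, latency_ns=90, bandwidth_gb_s=1.6) <
      access_time_ns(2**20, latency_ns=45, bandwidth_gb_s=1.0))
```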

BDC
07-26-2006, 07:06
In light of cpus picking up gpu functionality, AMD's decision looks very sensible.

I'm still rather doubtful about integrated systems. Notice how many games work with onboard graphics? I appreciate it will be a little different, but still...

hoom
07-26-2006, 07:34
Well, all the Xbox 360 games for a start, and I think that's the sort of integrated system this is aimed at, rather than the traditional bare minimum to run the XP GUI that has previously been proffered mainly by Intel.

AMD's focus for a long time has been on open-access platforms, and most recently they have been making quite a bit of noise about powerful co-processors with direct coherent HT links to the CPU (Torrenza).
A prime candidate for that sort of thing seems likely to be a big, powerful GPU, and that's a very good thing :2thumbsup:

Lemur
07-26-2006, 18:56
Looks like it has come to pass. (http://www.zdnetindia.com/news/hardware/stories/150634.html)


AMD acquires ATI and drops prices

First, AMD has announced that it is to buy ATI in a deal that will cost them $5.4 billion ($4.2 billion cash, $1.2 billion stock). This confirmation comes after weeks of rumor and speculation that the two companies were to merge.

What does this mean for AMD? This deal gives them access to the growing cellphone and handheld devices market, a market that's currently far more buoyant than the PC market. This announcement might also mean that as a consequence NVIDIA will pursue the Intel market more vigorously and maybe even cut back on AMD components.

Also, AMD has announced price cuts for many CPUs. Notably, the Athlon 64 FX-62 is down to $827 and the Athlon 64 X2 5000+ down to $310. At the same time, AMD announced that the Athlon 64 X2 4400+ and 4800+ for both Socket 939 and AM2 have been discontinued. This price cut still leaves the FX-62 vastly overpriced, while bringing the price of the X2 5000+ to something more comparable to the Intel E6600 Core 2 Duo. The main advantage that AMD now has is that the prices of motherboards for their CPUs are going to be far lower than those for Core 2 Duo CPUs, offering AMD CPUs a temporary advantage over Intel's Core 2 Duo.

It's pretty clear that what AMD are hoping to achieve with these price cuts is to distract user attention away from power and concentrate instead on price. That is, apart from the FX-62 - I'm really not sure who AMD hopes will buy these insanely overpriced processors. It would have taken a far more ruthless price cut to bring this CPU down to a competitive level, and it seems that AMD have not been able to juggle the numbers in a way that would allow this to happen. My prediction is that AMD will have to find a way to cut the price of the FX-62 by maybe up to 50% again to make it competitive, or drop it from the line-up altogether.