CJM
Jul 20, 12:35 PM
You realize there are probably only four people on this board who are old enough to get that joke, right?
My "vote" goes for "Hex" - "The Mac Hex. Buy one and see." Then again, maybe not.
16 here, but I still get it :p
Come on, some Mac fans do a little research now and again :P
nsjoker
Aug 11, 12:13 PM
This phone is going to have to be pretty amazing for me to get one... I'm talking a full-fledged iPod with the capabilities of a great cell phone... and decently priced. Terminating my contract just isn't worth it from an economic point of view.
Mattsasa
Apr 6, 03:01 PM
I hope that number keeps rising; we need competition to not let Apple rest on its laurels.
Apple isn't resting on its laurels.
If that number rises... which it will... it just means fewer developers and apps for iOS.
kdarling
Apr 20, 10:46 AM
http://cultofmac.cultofmaccom.netdna-cdn.com/wordpress/wp-content/uploads/2011/04/Screen-shot-2011-04-19-at-8.37.05-PM.png
feel free to point out how difficult it is to see any similarities...
Choosing icons that have taken on universal meanings, and thus are similar, is quite a bit different from direct copying, of which we see none.
The closest ones in that group are probably the phones, and yet if you search for a phone icon on the web, or even on cell phone buttons, probably a quarter of them are slanted. Moreover, green is an extremely common color for the primary phone button, which is why Apple chose it themselves.
The use of rounded square icon backgrounds is a bit more damning, but still a style choice. Also, Apple's has a shadow and my Fascinate doesn't have the rounded square on most anyway.
Btw, I have noticed that Apple hasn't tried to claim ownership of the twirling wait symbol, but a lot of us were using that before they were.
I think Apple might have much better luck showing that the Galaxy phone shape greatly resembles the 3GS.
Xenious
Jul 14, 05:27 PM
Dual drive slots are cool, but the design is boring. Don't get me wrong, I love my G5 PowerMac; I was just hoping for a new or different design for the next ones... maybe the same but square, or smaller, or something. Oh well, it doesn't matter; I'm still buying. :)
mikewilder23
Jun 21, 03:43 AM
Well, looking forward to its launch... :)
Erasmus
Jul 21, 11:55 PM
So I read in this thread that Kentsfield and Clovertown ARE compatible with Conroe and Woodcrest sockets (respectively). (Cloverton or Clovertown?)
Hope for upgrading an iMac to Quad Core is kindled! At least if Apple releases Conroe iMacs.
BTW, in my opinion, one thing a person should never, ever say is that some computer has too much power and that the power will never be needed. So when 128-core CPUs come out in ~10 years' time, will we still consider dual-core CPUs fast enough for our use?
I seem to remember that when the original DOS operating system was created, its RAM was limited. I can't remember exactly to how much, but it was decided that people would never use more than a few kilobytes of memory. Now we are arguing that a Mac should provide no less than a gigabyte! Now that we are moving to 64-bit processing, with its capability to address about 16 exabytes, or millions of terabytes, of storage, it seems impossible that we will ever need 128-bit computing. But, no doubt, one day we will.
When we are able to download our entire lives, and even our consciousness, into a computer, as is said to happen in about 40 years (something I am very much looking forward to), I dare say it will take a lot of memory to do, and even more processing power to manage effectively, especially if we want to "live" inside computers, as we will no doubt want to do someday.
So, as a conclusion to my most recent rant: please, never tell me a computer is too powerful, has too many cores, or has too much storage capacity. If it is there to be used, it will be used. It always is.
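The 64-bit address-space point above is easy to check with quick arithmetic (a throwaway sketch, not anyone's actual code):

```python
# How much memory can an n-bit pointer address?
for bits in (16, 32, 64):
    total_bytes = 2 ** bits
    # 1 EiB (exbibyte) = 1024**6 bytes; 64-bit addressing tops out at 16 EiB.
    print(f"{bits}-bit: {total_bytes:,} bytes "
          f"(~{total_bytes / 1024 ** 6:.3g} EiB)")
```

The jump from 16-bit (64 KiB) through 32-bit (4 GiB) to 64-bit (16 EiB) is why each transition felt "impossible to fill" at the time.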
eMagius
Aug 8, 07:31 AM
Hmmm, most of the features are already in Windows? What version of Windows do you have?
2003.
myemosoul
Jun 23, 02:33 PM
I confirmed today that my store will not have any for release day tomorrow. I got the district manager's phone number and left a message about wanting my $184 gift card put on my debit card instead, due to false advertising that they would be doing pre-orders instead of reservations, which led me to believe that I would get a phone on release day. As a result, I am forced to go camp out at the Apple Store overnight in 91-degree heat in a few hours, with my fingers crossed that I get one. All of this could have been avoided if they hadn't posted on their Twitter on June 9th that they were taking part in release day.
aegisdesign
Sep 13, 12:30 PM
The Mac Pro isn't for most people. It's for professionals and professional applications, which are usually multithreaded and will take advantage of the capabilities.
If you have a complaint about all these cores and not being able to take advantage of them, then this is not the computer for you. You're probably not using the software that will take advantage of them, so let it go and stop whining about it. For those of us that do, this is great news.
It was just a general point, not a whine, so don't get your panties in a bunch. And some of the applications that don't take advantage of multiple cores currently are Adobe Photoshop and Quicktime which both rarely use more than two cores and sometimes only one. Both pretty important to professionals.
jmbear
Nov 29, 12:39 PM
See, that's the catch-22 for new artists. The labels are the ones that get tunes played on the radio. In the 50's and 60's they would strong-arm their stuff in, but I'm sure even nowadays they provide incentives (read: bribes) to get new stuff on the air. Especially if they think the band is really good and will make it in the long run. And don't fool yourself into thinking a new band can get huge without radio.
The internet can become the new radio. I am quite fond of looking for pre-made playlists; I get the songs on LimeWire, listen to them, buy the ones I like legally, and delete the ones I don't. You don't get commercials, just music. I am not saying that radio is going to disappear completely. TV didn't kill it. But its importance will diminish.
The problem is that the labels get the artists by the balls when they sign them up to ridiculous contracts. Your 1-4 examples look pretty good on paper, but in order to sell any significant number of copies of their music, anyone wanting it (but who doesn't know it yet) has to wade through tons of (what that person sees as) crap just to get any exposure to something they'll consider good. I'm sure there's a lot of music in the indie catalog that I would just love, but I don't have the time to wade through it all to find it. Instead, I'll listen to the radio and when I hear something I like, I'll try to pay attention to who it is. I may or may not end up buying it, or checking out what else they do, but without radio exposure, most good indie bands don't have a chance in hell of selling to anyone except those that happen to be in the bar where they're playing one weekend.
iTMS could potentially change this. There are some people who will do all the research for you (as in what is good music), and then ratings will allow you to get the good songs! It's similar (and somebody will flame me for saying this) to researching a product on Amazon or CNET: you look for an LCD screen, all the results pop up, and you go for the ones with the highest ratings, read the comments, and eventually make up your mind. Some day you will look up electronic music (which I love), all the DJs will pop up, you will pick the highest-rated songs or playlists (because most people like a song because other people like it), listen to them for free (yeah, just like radio), and then buy them if you want.
Now, if you take a look at already established and popular bands, that's a different story. Someone mentioned huge bands like Pink Floyd. Their last couple of CDs didn't need a big label to sell. People were going to buy them if they liked Floyd, no matter what. And in a case of that kind of popularity, the radio stations were going to play them with or without a major label. The same could be applied to other huge (classic) rock bands, as well as established artists in other music styles (country, rap, R&B, blues, etc.). Another example would be someone like Eric Clapton. He could put one out on "Clapton Records" and would sell nearly, if not exactly, the same number of CDs as he will on a major label.
I agree record labels + good music = superstars like Clapton, Floyd, U2, etc. But these bands became popular in a different time (before the internet). The internet is changing the record labels' business model, and that is what they are afraid of. The new way of creating bands and distributing their music is not as profitable for them as it used to be.
Unfortunately, the number of artists (of any type of music) who could dismiss the labels and still sell as many CDs and get the same radio exposure is limited. And any new band is going to go nowhere without radio (or MTV/VH1) exposure.
The internet is offering them exposure. Right now MTV and VH1 are still popular, but YouTube, Yahoo!, and MSN could become the new MTV and VH1.
Not really relevant, but interesting to think about: most of you have probably seen the video of the "Numa Numa" guy (I can't link it because I am at work and the proxy does not allow me to visit YouTube). But how many have actually seen the video for the song? YouTube made that fat kid a star, and most people probably know his face better than the guys who sing the song. Exposure.
In the end, I don't see the labels going away totally any time soon. They're in cahoots with the big FM music stations and, in general, they do a good job of promoting good new bands that sign up. It's just a shame that there's really nothing to keep them from raping the artists. If there were just some way for new bands to get exposure to the masses without having to sell their souls to the labels, then things would be better. Unfortunately, the internet can only go so far in helping a new band with this.
I agree, they won't go away anytime soon, but change is coming, and change will be good for artists and consumers, not for the record labels.
Sorry for my weird grammar or misspellings; I am not a native English speaker, I don't have a spell checker on this computer (in English at least), and I am too lazy to proofread what I wrote lol :)
EagerDragon
Aug 26, 10:01 AM
I'm the same way. I have had .Mac since way back when it was "Free for Life" and I just have gotten used to keeping it. I also keep thinking that ole Jobs and company are going to come up with the killer .Mac app that will make .Mac indispensable.
I'm still waiting...
sierra oscar
Sep 19, 08:53 AM
It's not quite 0700 Cupertino time - so maybe? :)
swingerofbirch
Aug 7, 04:25 PM
Good lord. Whatever happened to simplicity? It looked like a three ring circus up there today.
Now come on. Time Machine? With a picture of outer space and stars? This looks so gimmicky. They are getting to be like Microsoft, just adding new features instead of making things easier and streamlined. Why not just improve the Backup program that comes with .Mac, or include it for free? Do we really need another interface? To me it looks like form over function.
theOtherGeoff
Mar 22, 04:55 PM
samsung designs and builds stuff in factories they OWN. Not all of their manufacturing is outsourced, unlike apple. Yes samsung provides ram, LCD (?), and A5 for apple's ipad. It was rumored that TSMC would also make A5 for apple so that apple is not so dependent on samsung but from what I saw in teardowns, samsung is still making some, if not all, of A5.
The difference is Samsung outsources its OS development, its developer community management, and its app ecosystem.
Cost competitive doesn't mean experience competitive.
I think for 'spec' people (hard-core coders, corporate types that need to control configuration), Samsung (and more importantly, when HP gets in the game, HP) will compete there... HOWEVER, this is a consumer-driven market, and much like the Sony Walkman back in the day, or Rollerblades ([tm]... the rest were 'inline skates'), Apple is 'defining' the market... and the rest are just knockoffs.
And unlike the old BMW pricing explanation (excuse) for Macs (equal specs and quality... from Apple, HP and Dell are about the same in price), Apple is pushing the iPad's experience at BMW levels, but at Honda prices.
And RIM and Samsung are pushing mid-'80s GM quality against a 2012 BMW at Honda prices, when the market will probably demand Kia prices for the 'experience'.
rdowns
Feb 28, 06:29 PM
Lee, I agree with you about what you say, but he clearly did say that this was only his opinion. People are allowed that, even if it is hateful and exclusionist.
Agreed, but when you air your opinions in public, others have the right to challenge them.
dethmaShine
Mar 26, 09:12 AM
I use my computer as a "real computer" and I like virtually every change I've seen. I wish people wouldn't generalize so broadly and presume that because certain additions aren't something that they use that it has nothing to do with "real work."
ZAiPhone
Apr 5, 05:22 PM
Not again..
NAB is for broadcast professionals - it's doubtful there will be computer releases there.
I've been to NAB and your statement could not be further from the truth. The FCPUG super meet is the perfect event to launch FCP. It's the largest global gathering of FCP power users.
RedTomato
Sep 13, 10:11 AM
Personally, I still see data transfer, namely from storage media, as a huge bottleneck in performance. Unless you are doing something really CPU intensive (vid editing, rendering, etc.), most of the average "wait-time" is the damn hard drive.
Arrays of cheap RAM on a PCIe card?
The RAM companies don't seem interested in making wodges of slow, cheap, hi-cap RAM, only in bumping up the speed and upping the capacity. For the last 10 years, a stick of decent RAM has always been about £100/$100 no matter what the capacity / flavour of the moment is.
Even slow RAM is still orders of magnitude faster than a HD, hence my point. There's various historical and technical factors as to why we have the current situation.
I've also looked at RAID implementations (I run a RAID5) but each RAID level has its own problems.
I've recently seen that single-user RAID3 might be one way forward for the desktop, but don't really know enough about it yet.
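The "orders of magnitude" gap is easy to see with ballpark 2006-era latencies (illustrative figures I'm assuming for the sketch, not measurements):

```python
# Rough random-access latencies (illustrative ballpark figures).
dram_ns = 60             # DDR2 random access: tens of nanoseconds
hdd_seek_ns = 8_000_000  # ~8 ms average seek on a desktop hard drive

ratio = hdd_seek_ns / dram_ns
print(f"RAM is roughly {ratio:,.0f}x faster per random access")
```

Even halving or doubling either figure leaves RAM around five orders of magnitude ahead, which is why a RAM-backed storage card looks attractive despite the cost.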
Tymmz
Aug 8, 01:09 AM
Nothing impressive really... top secrets should be good.
Time Machine is ok. It looks awful for an Apple product, what is up with that background? Ugly.
I totally agree, it looked quite ugly.
LegendKillerUK
Apr 6, 02:34 PM
That's a common misreading of what Jobs said.
iOS was developed for the phone first.
As Jobs explained, there was a simple UI demo done on a touch device originally designed to be a keyboard input prototype. That demo gave him the idea to go all touch on the iPhone. That's what he meant by "the tablet came first".
Since we know that during summer/fall the first iPhone UI concepts were done using iPods with wheels, his touch "eureka" moment probably came in late with the UI demo almost certainly done under OSX.
According to all known histories, the actual creation of iOS didn't begin until 2006. Prior to that, some at Apple were still proposing using Linux for the phone OS.
But he then said that after seeing how well it would work on the phone, they put the tablet project on the shelf and focused on the phone, as it was more important. Which means it was a tablet and not just a touch screen device in the beginning.
epitaphic
Aug 19, 05:53 PM
And I'm not convinced this is only an application problem. When I run Handbrake on the Quad G5 alone, it uses just over two cores (203%).
So what happened to:
Both Toast and Handbrake can use 4 cores EACH
Looking at the handbrake forums, speeds seem to vary drastically between users with the same machine. Definitely seems to be affected by whatever else you have running or configured in the OS or otherwise. I suppose the "cleanest" install to test is in the Apple store (I'm just assuming they do a clean ghost copy at shutdown or end of day?)
When I ran tests on the Mac Pro at the Apple Store last Saturday between Toast and/or Handbrake, their use of more cores alone and together was much better.
So your benchmarks show the Mac Pro using 15-33% less CPU than the G5? There's no doubt that Woodcrest is a superior chip architecture to the G5 (one would hope, after 3 years), and that's why you're seeing more FPS despite less CPU use. But why does it use fewer cores, though? Seems like either it's a software problem OR some hardware is being maxed (I/O or FSB, perhaps?)
So would it be correct to say that the only app that is even remotely "Quadcore aware" is Toast? It seems like by the time professional apps are made to take advantage of 4 cores we'll probably be on more than 8! :eek:
If only they could build something in the CPU itself that delegates tasks to n cores, we'd all be sorted. :)
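On the "something in the CPU that delegates tasks to n cores" wish: the hardware can't split one sequential job on its own; the application has to chunk the work itself. A toy sketch of farming independent chunks out to one worker per core (`encode_chunk` is a hypothetical stand-in, nothing to do with HandBrake's real code):

```python
from multiprocessing import Pool
import os

def encode_chunk(frames):
    # Stand-in for per-chunk encoding work: any CPU-bound function works.
    return sum(f * f for f in frames)

if __name__ == "__main__":
    frames = list(range(1000))
    n = os.cpu_count() or 1
    # Split the frame list into n roughly equal, independent chunks.
    chunks = [frames[i::n] for i in range(n)]
    # One worker process per core; the OS schedules them across cores.
    with Pool(n) as pool:
        results = pool.map(encode_chunk, chunks)
    print(sum(results))
```

Real video encoding is harder to split this way because frames depend on each other (keyframes, motion prediction), which may be part of why encoders of that era scaled poorly past a couple of cores.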
ergle2
Sep 14, 10:49 PM
Really, completely new? As in, to Core 2 what the G5 was to G4? In just two years?? I guess they're really ramping things up... Core 3 Hexa Mac Pros, anyone?
Intel's stated plans as I understand them are thus:
A new micro-arch every 2 years. I don't think they mean brand new so much as "significant changes/improvements". Whether this is akin to Yonah->Conroe or Netburst->Conroe remains to be seen, but more like the former (or perhaps Pentium-M -> Merom -- Core Duo was very much a stop-gap). Little has been released about Nehalem, but at one time it was slated as "based on Banias/Dothan", due in 2005 and expected to ramp to 9/10GHz.
"Off" years will receive derivative versions (e.g. Merom->Penryn), which appears to be mostly stuff like L2 cache increases, faster FSB speeds (at least while we have FSBs - 2008 looks like the year for DCI, finally), die shrinks, increasing the number of cores (expect at least one to be more cores on a single die instead of two dice/package), etc.
Die shrinks are currently scheduled for "off" years, in order to stabilize the process ready for the new micro-arch in the following year, so Intel doesn't need to deal with both a new process and a new arch at the same time, and presumably in part to keep speed increases coming in "off" years.
Of course, roadmaps can change quite rapidly -- it's not that long ago that Whitfield was expected to debut late 2006 with DCI (FSB replacement). Whitfield was replaced by Tigerton which is now due sometime in 2007...
One thing's for sure, Intel appears to have learnt a great deal from the Netburst fiasco -- how not to do things, if nothing else. Unfortunately, they still estimate ~50% of processors shipping in 1Q2007 will be netburst-based (mostly Pentium-D).
Lone Deranger
Mar 31, 05:30 PM
To put it in Nelson's words: