
babyj
Sep 19, 07:57 AM
It amazes me that people who are so opposed to discussion of upcoming Merom notebooks still click on the links to the forums with titles using the terms "Merom" and "MacBook Pro". If you're a regular on the forums, sure, I can see how constant discussion about the "next" platform might get old. So ignore them. Do something productive with your time.
That isn't exactly what I said. I don't have a problem with people discussing new and upcoming products and features and when we might see them. Count me in.
It's the people who are getting so worked up, annoyed at Apple, threatening to dump the platform and move to Windows, claiming Apple are three months behind Windows systems and generally bitching.
It's all pointless, as the same people will start up again with the next technology advance as soon as the MacBook range is updated with Merom.

Eduardo1971
Apr 6, 10:26 AM
Boy this is great (**deadpan voice**).
Grr.
Want. Refreshed. iMac. NOW!!
:D

ChrisA
Aug 16, 10:53 PM
My main interest is in the FCP results.
On a fixed budget, does anyone know the advantage/disadvantage of going for the 2.0Ghz with 1900XT over 2.6Ghz with the std video card?
I think movie editing depends a lot on the speed of the disk subsystem. After all, MiniDV is 12GB per hour. That's a lot of data. When you "scrub" a shot, all that data has to move off the disk and onto the video card. Even with 16GB of RAM, not much of the video data can be held in RAM. So the G5 and Intel machines have disks that are about the same speed. The speed of a disk is measured by how fast the bits fly under the read/write head, not the interface speed. So I am not surprised the Intel Mac Pro is not hugely faster for video.
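For a rough sense of scale, here is a quick back-of-envelope on that 12GB-per-hour figure (a sketch only: the scrub multiple and the disk throughput number are assumptions for illustration, not measurements):

```python
# Back-of-envelope data rates for MiniDV editing, using the 12GB/hour figure
# from the post above. The disk throughput is an assumed ballpark for a
# 7200rpm desktop drive of that era, not a measured value.

GB = 1000 ** 3                          # decimal gigabytes

dv_bytes_per_hour = 12 * GB
dv_mb_per_sec = dv_bytes_per_hour / 3600 / 1e6
print(f"Single DV stream: ~{dv_mb_per_sec:.1f} MB/s sustained")

# Scrubbing faster than real time multiplies the throughput needed.
scrub_multiple = 8                      # hypothetical 8x scrub speed
print(f"Scrubbing at {scrub_multiple}x: ~{dv_mb_per_sec * scrub_multiple:.1f} MB/s")

# Assumed sustained rate for a typical desktop drive of the period.
disk_mb_per_sec = 60
print(f"Assumed disk sustained rate: ~{disk_mb_per_sec} MB/s")
```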

paulvee
Aug 18, 07:38 PM
My 3.0's shipping date just changed - for no obvious reason - from 8/20 to 9/19. One month. Clearly, something just got snagged in the supply chain.
Anyone else have this?
Okay, it seems to be a RAM bottleneck. I had ordered a couple of 2 gig chips from Apple because I didn't mind paying the penalty now in order not to have to sell 1 gig'ers later on.
Anyway, I'm on the phone now getting the standard RAM configuration; then I'm just going to go with OtherWorld's RAM.
I wish Apple had gotten their RAM supplies in order before they started shipping. Well, what can you do.

PCClone
Apr 27, 10:40 AM
How do I create a location map on my iPad 2?
janstett
Sep 13, 01:37 PM
The OS takes advantage of the extra 4 cores already, therefore it's ahead of the technology curve, correct? Gee, no innovation here... please move along, folks. :rolleyes:
As for using a Dell, sure, they could've used that. Would Windows use the extra 4 cores? Highly doubtful. Microsoft has sketchy 64-bit support, let alone dual-core support; I'm not saying "impossible", but I haven't read jack squat about any version of Windows working well with quad cores. You think those fools (the same idiots who came up with Genuine Advantage) actually optimized their OS to run in an 8-core setup? Please pass along what you're smoking. :rolleyes:
Sorry to burst your reality distortion field, but see my previous post. I ran a dual-processor Pentium II NT setup ten years ago and Windows handled it just fine THEN -- back when Apple barely supported it with a hack to its cooperatively multitasked OS and required specially written applications with special library support.
BTW, my 2-year-old Smithfield handles 4 processors fine (dual-core Pentium Extreme with Hyper-Threading = 4 cores).
The only limit with Windows is that they keep the low-end XP Home to 2 processors on the same die. There is probably an architectural limit on both OS X and XP, and if it's not 8 it's 16. It's probably 8.

Kingsly
Aug 11, 12:40 PM
:eek: :)
I hope it is released sooner rather than later. My Z500 only has about a month of life left in it....

Gupster
Apr 7, 10:38 PM
:mad: Best Buy told me today that they had them in but Apple would not let them sell them. I have been going every other day for two weeks, and they finally tell me they have them and can't sell them. I hate this crap. I want my iPad 2.

Dr.Gargoyle
Aug 11, 01:50 PM
I could also ask why the rest of the world doesn't get with the program and move to better technology with CDMA2000 like the US and parts of Asia have?
As I said before, GSM has 81% of the market. UMTS (W-CDMA) enables hand-over back and forth between UMTS and GSM; CDMA2000 cannot do hand-over between GSM and CDMA2000. (See Wikipedia (http://en.wikipedia.org/wiki/W-CDMA): "The CDMA family of standards (including cdmaOne and CDMA2000) are not compatible with the W-CDMA family of standards that are based on ITU standards.")
Hence all networks that have GSM will transfer to UMTS, since this decreases their initial investment as they transfer from 2/2.5G to 3G. Changing network standard is expensive, but the GSM/EDGE market share has been growing in the US and will most likely continue to grow. At the same time, CDMA is non-existent in Europe.
The conclusion is simple - CDMA2000 is in the long run as dead as Betamax.

devman
Aug 6, 02:00 PM
With the iSight and IR sensor rumored to be integrated into the new line of Cinema Displays, I guess Apple's going to adopt HDMI as the I/O interface, making Apple one of the first companies to do so, plus an HDMI-enabled Mac Pro with Leopard fully supporting it. Why? HDMI is just like ADC, plus it's an industry-standard port. You need only one cable to carry all the communications (FW+USB+Sound+...), without having to clutter your desktop with multiple cables. I see it coming!
I think they'll go UDI instead of HDMI (and save fees). The really interesting question here, though, is HDCP and what it means for all existing hardware, including Cinema Displays...

Buschmaster
Nov 29, 09:20 AM
No thanks.
I pay for my music.
Oh, according to them, you must have a Zune. Because everyone who doesn't use a Zune steals music.
This news makes me want to go steal Universal junk I don't even like.

falconeight
Apr 6, 03:11 PM
I bought a Xoom... the salesman started it up for me, and after seeing it I changed my mind. It was my first return before I even swiped my card.

DakotaGuy
Aug 11, 02:39 PM
It is more like 81% of the world market.
MS Windows has about 95% of the world market...doesn't mean the technology is better.:)

peeInMyPantz
Jul 28, 12:50 AM
I'm hoping for Merom news at WWDC, but Fujitsu announced Merom laptops that will only be available sometime in Q4. I hope the same isn't true for the MBP.
http://www.engadget.com/2006/07/27/fujitsu-to-add-core-2-duo-options-to-lifebook-n6400-series/
At least they made an announcement.
Do you think Apple will try to release Core 2 Duo notebooks as soon as possible, before Leopard, so that once Leopard is released, more users have to buy it separately? The longer the wait, the fewer users will switch from their current MBP to the new MBP, knowing that Leopard's release date is soon.

mirko.meschini
Apr 7, 02:47 AM
The nVidia 320M is about 20W, so they can use 17W processors on the 11.6" and 25W processors on the 13", with increased battery life on both models.

furi0usbee
Mar 26, 06:48 PM
Windows manages to run legacy apps still. Even if you do have to resort to using the virtual machine they've called 'XP Mode.'
Because Windows is bloatware. I don't want my Mac OS to be able to run **** that's 10 years old. This only hampers innovation by having to spend time making sure all the old stuff doesn't break. Move on, my friend. I can probably use XP to print to a 15-year-old dot matrix printer.

Raid
Apr 28, 11:04 AM
I really have nothing to add to this thread; the whole thing was silly from the get-go and is just a fantastic example of how American politics is more show than substance (and an over-the-top soap opera at that!).
But I saw this today and thought I would share:
http://cheezfailbooking.files.wordpress.com/2011/04/funny-facebook-fails-doubting-thomas1.jpg
You may now continue to distract yourselves from real issues.

lsvtecjohn3
Mar 22, 02:09 PM
Lack of Flash support is the Achilles' heel of the iPad. I hope Jobs gets off his high horse and relents.
He's not, because with the iPad, iPhone, and iPod touch they're pushing HTML5 forward:
http://www.macrumors.com/2010/10/27/54-of-h-264-web-video-now-available-in-html5/

fivepoint
Apr 27, 04:19 PM
It'd be fascinating to see how much people cared about 'layers' if the documents in question related to Bush's National Guard deployment or something similar. ;) Haha, no bias here boys!
The difference between me and you is that I'd want an explanation in either account. ;)
dgree03
Apr 6, 02:43 PM
That's what I've gone for, Wifi only. With the wireless hotspot feature of the Nexus S, a 3G version seemed pointless for me.
I thought the same thing, until I bought my 3G Xoom. Then I finally felt freedom! I have a rooted EVO, and with my iPad 1 I would tether all the time: take my phone out, start wireless tether, put my phone back, kill my phone battery... rinse and repeat.
Now I don't have to kill my phone battery tethering, nor do I have to deal with the hassle of enabling tether on my phone all the time.
hcho3
Apr 19, 02:25 PM
Samsung forgot to copy Apple and put the lock/power button on the side.
The lock/power button belongs on the top of the device.
If you look at the Nexus S, Samsung really did copy Apple's box design.
If you look at their phone/alarm/clock icons, they copied.
Samsung has no chance of winning this lawsuit. Apple was preparing to sue Samsung for a long time; they just needed time to prepare.
yoak
Apr 11, 08:28 AM
Then that just begs the question, "why haven't these people left already?" FCP has been fairly stagnant for years. There are plenty of other alternatives, so doesn't that kinda make them fanboyish too for sticking it out when up to this point Apple has given zero hints about when or how it will take FCP to the next level?
I'm not in the video editing biz, but if the pro s/w I use in my profession hobbled my efficiency and workflow the way you are carping about FCP, and there were viable alternatives, I would abandon it quicker than a pigeon can snatch a bread crumb. Just sayin'.
It's costly to change. It takes time to learn new software, time that could be spent working instead. Then there's all the money already invested in the platform.
At least here, Premiere is not really an option if you work in broadcast or film, since everyone either uses Final Cut or Avid.
mwswami
Jul 20, 11:56 AM
See http://www.anandtech.com/IT/showdoc.aspx?i=2772 for a comparison of Woodcrest, Opteron, and UltraSPARC T1.
Dual Woodcrest (4 threads) easily outperformed the UltraSPARC T1 (32 threads). The power consumption of the dual 3.0GHz Woodcrest system came out to be 245W, compared to 188W for the Sun T2000 with the 8-core UltraSPARC T1. But the metric that matters most is performance per watt, and that's where Woodcrest came out as a clear winner.
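To make the performance-per-watt point concrete, the metric is just benchmark score divided by sustained power draw. A minimal sketch follows; the wattages are the figures quoted above, but the scores are hypothetical placeholders, not AnandTech's results:

```python
# Performance per watt = benchmark score / sustained power draw.
# Wattages are the figures quoted above; the scores are HYPOTHETICAL
# placeholders used only to show the calculation, not AnandTech results.
systems = {
    "Dual Woodcrest 3.0GHz": {"score": 100.0, "watts": 245},
    "Sun T2000 (UltraSPARC T1)": {"score": 60.0, "watts": 188},
}

for name, s in systems.items():
    perf_per_watt = s["score"] / s["watts"]
    print(f"{name}: {perf_per_watt:.3f} score/W")
```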
DStaal
Sep 13, 09:12 AM
A bit pointless given that no software utilises the extra cores yet. But nice to know, I guess.
Mac OS X distributes threads and processes across cores/CPUs to optimize performance already. (Subject to some limitations, as noted already.)
Many Mac programs which can benefit from multiple threads already use this, and will automatically get boosts from 8 cores depending on the amount of concurrency they support.
On the other hand, not everything is suitable for concurrent execution. Photoshop editing an image would love to have a core per pixel. BBEdit couldn't care less, most likely. It all depends on what you are doing.
Plenty of Mac software would use the extra cores, if they were available.
(Note: I keep specifying 'Mac' here. There is a reason. Windows isn't as good at multithreading/processing yet...)
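As a minimal sketch of that point (purely illustrative, not how any of these apps are actually written): an embarrassingly parallel per-pixel job can be spread across however many cores the OS exposes, while a serial task would gain nothing.

```python
# Toy example: split a fake "image" into chunks and let the OS schedule one
# worker process per core. Embarrassingly parallel work like this scales with
# core count; a task with no concurrency would see no benefit.
from multiprocessing import Pool
import os

def brighten_chunk(chunk):
    # Stand-in for a per-pixel operation (a Photoshop-style filter).
    return [min(255, px + 40) for px in chunk]

if __name__ == "__main__":
    image = list(range(256)) * 4000                 # fake flattened grayscale image
    workers = os.cpu_count() or 1
    size = -(-len(image) // workers)                # ceiling division for chunk size
    chunks = [image[i:i + size] for i in range(0, len(image), size)]

    with Pool(workers) as pool:                     # one worker process per core
        processed = pool.map(brighten_chunk, chunks)

    total = sum(len(c) for c in processed)
    print(f"Processed {total} pixels using {workers} worker processes")
```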