wpotere
Apr 27, 09:49 AM
Did he release a different form of the document today?
To catch you up... There are two versions of the form, the long and the short. The long form is the one you are used to seeing, and the short is an abbreviated version that is just as legal. Many states are using the short now, including Hawaii. If he is holding a clearance (which he is), he will have had a background check and these documents would have been submitted. Basically he was cleared with no problems. This is just people raising hell over something stupid.
nplima
Nov 29, 08:43 AM
What Universal really wants is someone to sue them for slander. Stating in public that all iPod owners are thieves is rude to say the least. I bet that if I had similar public attention and went on to say that all RIAA members are mobsters, I'd be in trouble.
mkruck
Apr 6, 03:06 PM
Yeesh dude, at least your wife cares enough to do nice things for you. :(
Yes, and my response that you quoted was said tongue in cheek. People really need to lighten up and stop taking themselves so seriously.
Sent from my Xoom using Tapatalk
Evangelion
Sep 14, 01:14 AM
Didn't you get the memo? Hyperthreading was a joke.
At worst, it slowed performance down by a few percent. At best, it gave a substantial boost in performance. And multitasking tests clearly benefited from HyperThreading. That said, Intel dropped it, apparently because it consumed too much power. But we might see HT in some future Intel CPUs at some point, you never know.
HT as such is not a bad idea. Sun's UltraSPARC T1 uses such a scheme extensively.
puckhead193
Aug 17, 12:27 AM
I went to my local Apple Store, and holy crap, the thing is really fast. I'm tempted to get one instead of an iMac; the only thing that's holding me back is the size.
tk421
Nov 29, 10:44 AM
If all of you on here bought all of your music either from iTunes or from a record store, then, absolutely, complain away if that dollar is passed on to you. But if, as is likely in just about every case, you have a few songs you burned off a friend's CD or downloaded from a file-sharing site, then shut up; you are the reason this is necessary.
I guess I understand this. We all pay a little more on purchases to make up for shoplifting. But all of my music is legal, and I think this is a very bad move.
As others have pointed out, I doubt any of this money will actually end up in the hands of artists. And who decides which artists? And what about smaller labels? Nobody will be compensating them. My brother is unsigned. Who will pay him for the illegal copies of his music that I know exist? It seems to me, the artists getting the money (if any do) will be the ones that already sell the most and therefore are struggling the least.
To be clear, I strongly oppose stealing music. I also strongly oppose calling all music listeners thieves and charging us all for it. And I'm all for the blacklist, and I'll gladly tell Universal I'm through with their music!
Universal Music Group:
USA (212) 841 8000
France +33 1 44 41 91 91
UK +44 0 20 77 47 4000
feedback_fr@vivendi.com
Texas04
Nov 28, 06:29 PM
That would add to the money that they already get from the purchased music... Apple will not allow this... at least they shouldn't. And wouldn't Universal be happy as is?
Microsoft started this, and it is a good hit at Apple... but Apple has an agreement and will not break that agreement... especially to get rid of the ease of the 99-cent standard pricing
KT Walrus
Apr 7, 10:58 PM
I know some Apple Stores hold back iPad 2 stock for "special customers". I was talking to a retired school teacher who had a contact at an Apple Store and she said she got her iPad 2 by having her contact hold one for her when he could. She got hers a few days after they first went on sale when her contact called and all she had to do was pick it up at her convenience.
Best Buy employees aren't the only ones setting aside stock of iPad 2s. It isn't about first come first served, but who you know.
appleguy123
Feb 28, 08:43 PM
No because heterosexuality is the default way the brain works
Isn't it all hormonal mishaps in the womb? Does your God control that? If so, he is predisposing people to sin, and isn't that unfair that not all are exposed to that disposition?
rtdunham
Apr 27, 09:49 AM
I'm old-fashioned, I guess, because I have no interest in having a smartphone in the first place. I just have a standard flip phone. By owning a smartphone, you are always going to be faced with privacy issues...
Did you know dumb phones record every call you make? That they record who you call, and how long you talk to them? That when landlines are involved, numbers are recorded that pinpoint the location? That your phone transmits that information to your phone company? Look at your next phone bill. Your standard flip phone even records who calls YOU and tells THAT to your phone company, too. AND if you lose your phone bill--as is the case if you lose your phone--all that data's available, in unencrypted form, to anyone and everyone!
My take: Yeah, the data should've been encrypted, and prudence would have had it deleted after a short time. They're fixing that now. But it serves a purpose we all value, facilitating calling and optimizing location services when we want them. It's a glitch, nothing more, exaggerated by media attention (and I'm part of the media, so I'm not unfairly finger-pointing), just as happened with antenna-gate and the fuss over Toyotas accelerating out of control (where almost always the conclusion is that someone put their foot on the accelerator instead of the brake by mistake). Ten years from now someone will write an entertaining book about the gap between public hysteria and reality on these issues and many others (birtherism, anyone? or, if your political views swing a different way, government spending way beyond its means).
I'm not saying the location database is operator error. Clearly not. I'm just trying to keep it in perspective. (It's not time-stamped? It's accurate sometimes only to 50 or 81 miles, as in cases reported in this thread? My phone, using the data that's recorded, consistently puts me five miles from my home, in a different county, across a river, four or five cities away, due to some oddity of cell tower location).
Look, your credit cards not only keep track of where you've been, but how much you spent there, and when, with precise geographic accuracy. Sometimes they even tell what you've bought. Just look at your next bill. Did you know your bank keeps track of every check you write, and to whom, and sends that information to you unencrypted via the mail? Did you know...
I think we should keep this situation in perspective. Too many people here see the privacy sky falling on them, when they're really swimming in it. (Did you know the device you're using to read this doesn't protect you from being victimized by horrible unencrypted metaphors...?)
mirko.meschini
Apr 7, 02:47 AM
The nVidia 320M is about 20W, so they can use 17W processors in the 11.6" model and 25W processors in the 13", with increased battery life on both models.
NJRonbo
Jun 14, 06:58 PM
Why on earth would Radio Shack ask anyone to stand in line tomorrow to get a PIN, just to stand in line again on opening day for a phone you are not even guaranteed to get? What kind of crap is that?
The problem is, each store has a different opinion on the reservation policy.
I think I am going to order directly from Apple. Problem is, I have a $247 credit from Radio Shack and I don't even shop at their stores anymore.
Hellhammer
Apr 8, 09:01 AM
The trouble is... I find the TDP numbers for Sandy Bridge very misleading. For example, the previous i7 2.66GHz dual core had a TDP of 35W and the current i7 2.2GHz quad core has a TDP of 45W. Theoretically, it should only use 10W more when doing a CPU-intensive task, but according to AnandTech, who measured it, the Sandy Bridge quad-core i7 was using almost 40W more when running Cinebench.
http://www.anandtech.com/show/4205/the-macbook-pro-review-13-and-15-inch-2011-brings-sandy-bridge/14
It just doesn't make any sense. Going by those figures, if the i7 dual core was 35W, the i7 Sandy Bridge quad core would be around 70W.
Not sure how this relates to potential MacBook Air Sandy Bridge processors, but keep in mind.. there must be a reason why Samsung went for the ULV processor in their 13" laptop instead of the LV one.
The CPU isn't the only thing that changed. The AMD 6750M (~30W) has a higher TDP than the NVIDIA GT 330M (~23W). I had to put ~ because their TDPs are not officially stated by AMD or NVIDIA, so the figures are estimates based on previous GPUs and their TDPs. The point is that the AMD 6750M has a higher TDP.
There is also another thing. TDP is not the maximum power draw. Maximum power dissipation (MPD) is usually 20-30% more than the stated TDP. While MPD is rarely reached, as it requires maximum voltage and temperature, it can (nearly) be achieved with heavy benchmarking applications.
For example, the combined TDP of the quad-core SB and the AMD 6750M is 75W. If we add 20% for the MPD, that is 90W, just from the CPU and GPU! Of course those parts are not using 90W in that test, because things like the screen, HD, RAM etc. need power too. As the MPD overhead scales as a percentage, it can explain why the difference is so big in watts.
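The arithmetic in that example, spelled out as a quick sanity check (the ~30W figure for the 6750M is the earlier estimate, not an official number):

```python
cpu_tdp = 45  # quad-core Sandy Bridge i7, watts
gpu_tdp = 30  # AMD 6750M, estimated watts (not officially published)

# Combined TDP, then maximum power dissipation at +20%.
combined_tdp = cpu_tdp + gpu_tdp
mpd = combined_tdp * 1.20

print(combined_tdp, mpd)  # 75 90.0
```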
40W sounds a bit too much to explain with MPD though. IIRC the GT 330M is underclocked but I'm not 100% sure. You have a valid point that the SBs may be using more power than their predecessors. To make this more accurate, we should compare them with C2Ds though ;)
I guess we will have to wait and see, but an ULV in 13" would be more than a disappointment.
janstett
Oct 23, 11:44 AM
Unfortunately not many multithreaded apps - yet. For a long time most of the multi-threaded apps were just a select few pro level things. 3D/Visualization software, CAD, database systems, etc.. Those of us who had multiprocessor systems bought them because we had a specific software in mind or group of software applications that could take advantage of multiple processors. As current CPU manufacturing processes started hitting a wall right around the 3GHz mark, chip makers started to transition to multiple CPU cores to boost power - makes sense. Software developers have been lazy for years, just riding the wave of ever-increasing MHz. Now the multi-core CPUs are here and the software is behind as many applications need to have serious re-writes done in order to take advantage of multiple processors. Intel tried to get a jump on this with their HT (Hyper Threading) implementation that essentially simulated dual-cores on a CPU by way of two virtual CPUs. Software developers didn't exactly jump on this and warm up to it. But I also don't think the software industry truly believed that CPUs would go multi-core on a mass scale so fast... Intel and AMD both said they would, don't know why the software industry doubted. Intel and AMD are uncommonly good about telling the truth about upcoming products. Both will be shipping quad-core CPU offerings by year's end.
What you're saying isn't entirely true and may give some people the wrong idea.
First, a multicore system is helpful when running multiple CPU-intensive single-threaded applications on a proper multitasking operating system. For example, right now I'm ripping CDs on iTunes. One processor gets used a lot and the other three are idle. I could be using this CPU power for another app.
The reality is that to take advantage of multiple cores, you had to take advantage of threads. Now, I was doing this in my programs with OS/2 back in 1992. I've been writing multithreaded apps my entire career. But writing a threaded application requires thought and work, so naturally many programmers are lazy and avoid threads. Plus it is harder to debug and synchronize a multithreaded application. Windows and Linux people have been doing this since the stone age, and Windows/Linux have had usable multiprocessor systems for more than a decade (it didn't start with Hyperthreading). I had a dual-processor 486 running NT 3.5 circa 1995. It's just been more of an optional "cool trick" to write threaded applications that the timid programmer avoids. Also it's worth noting that it's possible to go overboard with excessive threading and that leads to problems (context switching, thrashing, synchronization, etc).
Now, on the Mac side, OS 9 and below couldn't properly support SMP and it required a hacked version of the OS and a special version of the application. So the history of the Mac world has been, until recently with OSX, to avoid threading and multiprocessing unless specially called for and then at great pain to do so.
So it goes back to getting developers to write threaded applications. Now that we're getting to 4 and 8 core systems, it also presents a problem.
The classic reason to create a thread is to prevent the GUI from locking up while processing. Let's say I write a GUI program that has a calculation that takes 20 seconds. If I do it the lazy way, the GUI will lock up for 20 seconds because it can't process window messages during that time. If I write a thread, the calculation can take place there and leave the GUI thread able to process messages and keep the application alive, and then signal the other thread when it's done.
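A minimal sketch of that pattern in Python (no real GUI toolkit here; the main thread just stands in for the event loop, and the names are illustrative):

```python
import threading
import time

result = {}

def long_calculation():
    # The slow work that would otherwise freeze the GUI
    # (shortened here so the example runs quickly).
    result["value"] = sum(i * i for i in range(100_000))

# Run the calculation on a worker thread so the main
# ("GUI") thread stays free to process window messages.
worker = threading.Thread(target=long_calculation)
worker.start()

while worker.is_alive():
    # A real GUI would pump its event loop here instead.
    time.sleep(0.01)

worker.join()
print(result["value"])
```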
But now with more than 4 or 8 cores, the problem is how do you break up the work? 9 women can't have a baby in a month. So if your process is still serialized, you still have to wait with 1 processor doing all the work and the others sitting idle. For example, if you encode a video, it is a very serialized process. I hear some work has been done to simultaneously encode macroblocks in parallel, but getting 8 processors to chew on a single video is an interesting problem.
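When the work does split cleanly, the fan-out/combine pattern looks roughly like this (a hypothetical chunked sum; ThreadPoolExecutor keeps the sketch simple, though CPU-bound work would want ProcessPoolExecutor so the cores actually share the load):

```python
from concurrent.futures import ThreadPoolExecutor

def chunk_sum(chunk):
    # Each worker handles one independent slice of the input.
    return sum(chunk)

def parallel_sum(data, workers=4):
    # Split the input into one chunk per worker...
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    # ...fan the chunks out, then combine the partial results.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(chunk_sum, chunks))

print(parallel_sum(list(range(1_000_000))))
```

Video encoding resists exactly this step: the chunks are not independent, because each frame depends on the frames before it.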
Willis
Aug 26, 05:43 PM
If the power consumption is the same... does that mean that the Merom and the current chips draw the same amount of energy while going full throttle?
If the above is true, if you turned down the Merom to match the speed of the current chips, wouldn't the Merom be drawing 20% less power?
In other words if the Merom and the current chip were both going 60 mph down the freeway, would the Merom be drawing less power?
Am I missing something here (such as the basics of electricity, the basic way that chips work, etc.)?
512ke
no.. what it means is that the chip is 20% more efficient using the same amount of power... Some have said that the chips do run a bit cooler because they are more efficient, but until they come out in the MBP... who knows?
NebulaClash
Apr 6, 01:29 PM
This can't be right. MR posters have assured me that the Xoom is better than the iPad. I mean, if you can't trust MR posters, whom can you trust?
mactoday
Apr 6, 10:55 AM
Since you have no clue how the sandy bridge airs will perform, I'll take your statement as FUD.
Actually, the 320M performs better than the Intel HD 3000, so the dude is right that the graphics chip paired with SB is slower.
Nuck81
Aug 13, 12:21 AM
It's refreshing that I don't have to go to gamespot forums to see a pointless immature fanboy pissing match :rolleyes:
joeboy_45101
Nov 28, 09:25 PM
It doesn't cost the consumer any more, why wouldn't you want the people who actually make the music you are listening to get compensated?
This debate is stale. People want something for nothing.
Wow! Where did you ever learn that from the MYASS School of ********! Hey here's an idea, since most of the music these companies produce is mastered and remastered on Mac workstations then why shouldn't Apple be able to come back and get some extra dough off of that. I mean you wouldn't want these record labels making something for nothing, now would you?
portishead
Apr 12, 12:28 AM
Here is my wish list:
RGB 444 10-bit support. Final Cut can't properly render RGB 10-bit material.
Real 3:2 pulldown and not 2:2:2:4 like it currently is.
QuickTime sucks. It needs better audio track support (5.1), subtitles, etc. I think we're going to see AV Foundation from now on. There needs to be a real QuickTime Pro that's better than what it currently is.
Compressor is just bad; it needs to be redone.
64-bit, Open CL, blah blah
Project based workflow, instead of capture scratch folders
Better interface.
I like Motion, just wish the timeline was a little better.
rockthecasbah
Aug 7, 11:07 PM
I liked all of the features but picked Time Machine because it just makes it so much easier to back up. Who cares if it isn't the most original thing ever? It's easy to use, integrated, and useful. :)
slooksterPSV
Aug 7, 02:07 PM
I can't wait till spring for Leopard. That's too long, I want Leopard now :D :D :D come on Steve, give us Leopard!
gnomeisland
Apr 27, 08:18 AM
I wish they would leave it on and let me use it. I consider it a feature. It would help me track hours at job sites automatically for billing. I thought of writing an app just for that.
That's an interesting idea.
I actually like Apple's response. I do think that not being able to turn OFF the feature was an oversight on their part, but I do wish there was a way to leave it on. I'd actually welcome a way to import that database into Aperture and use it to geotag my photos. Yes, there are apps to do that, but I have an iPhone 3G, so backgrounding those apps isn't really possible.
puggles
Jun 14, 07:42 PM
OK, definitely not going to Radio Shack... they changed the time from 7 AM to 1 PM and are now giving out PINs, which will put your name on a list, and they will call you as phones arrive at the store... definitely not guaranteed! They also seemed really desperate for my business. I'm guessing they also made it 1 PM so you will miss other pre-orders and be stuck with them... unless you can pre-order with both Apple and Radio Shack and cancel the Apple one if Radio Shack does work out?