lord patton
Aug 16, 11:31 PM
(sideshow bob)The Power PC...The!!!(/sideshow bob)
Bravo.
I don't know what ROFL stands for, but from context clues I'm thinking it means pretty damn funny. In which case, ROFL, dude.
ssk2
Mar 22, 03:28 PM
I know I haven't been on this forum for as long as some, but this topic again proves why I'm often dissuaded from posting more regularly.
The constant foot-stomping, ridiculing without even trying, 'my Dad-is-better-than-your-Dad' attitude towards other manufacturers, the list is ongoing. How can any of us write off the Playbook or the Samsung tablet without even trying them? Yes, they are second and third to the market, but then so was Apple with the first iteration of its smartphone. Now look where we are.
The iPad 2 does have some shortcomings, few of which are worth going into here. However, the OS of these devices IS crucial, and we are beginning to see iOS creaking slightly. In terms of looks and notifications, for me, Apple is lagging. I like how the Playbook looks and how, potentially, it should operate. Will I make a snap judgement? No. I'll try the damn thing first before making a judgement.
Do I see these tablets wiping out the iPad? Not a chance. Not in a million years. Do I see future versions of the Playbook and Samsung tabs wiping out the iPad? Perhaps, who can say. Mobile computing and tablets are here to stay now - saying and believing that the iPad will remain as dominant is pure wishful thinking from the more fanboy-minded of us.
sisyphus
Jul 14, 02:34 PM
That's nice...
They'd better have something in between this and the iMac...
Employed Lloyd
Apr 5, 05:19 PM
I'm not trolling, this is an honest question. But isn't Final Cut pretty much worthless for commercial use without a way to put the results on Blu-ray?
There are plenty of ways to put FCP output on Blu-ray.
If it's commercially worthless, that's news to the hundreds of thousands of us who make our living using it every single day.
iJohnHenry
May 3, 09:20 AM
I'll preface this by saying that I'm not a 'birther'; I believe Obama is the rightful president of the US. That said, this video, if it's true (I don't have Adobe Illustrator to verify), is pretty embarrassing:
http://www.youtube.com/watch?v=7s9StxsFllY
Saw that one already, and as the guy is a self-described 'expert', I choose to wonder about his video. :rolleyes:
~Shard~
Jul 15, 10:49 AM
I disagree. Using ATX power supplies is a stupid idea. I am sure Apple uses higher quality power supplies than you would pick up at your local CompUSA.
If they allow this there will be a lot of dead Macs, from power supplies whose rails aren't strong enough.
Not to mention those who buy the 400W model because it is only 20 bucks and drastically underpower their Mac.
This would cause too many problems. Keep it proprietary, IMO.
Actually that is a good point. Another good example is how some people install incorrect RAM into their Mac - they just pick up generic cheapo RAM, not Mac-certified, and wonder why they have all sorts of issues.
PhantomPumpkin
Apr 27, 10:55 AM
The difference is a question of access. To get at the records kept by your cell phone provider, you need a subpoena. Any roommate/guest/thief/stalker with access to your computer or iPhone can get the data off your iphone or the backup as it exists right now. I don't mind the former, but I want to do everything I can to prevent the latter.
Keep better tabs on your phone. Encrypt the computer backup. Yeah yeah, I know sometimes we lose things. Hell, I've lost my iPhone in my couch and it took a half hour to find out WHERE in the couch it went.
Even still, you have to take some responsibility at some point. We can't all rely on Apple/Google/Purina Brand Puppy Chow to keep our data completely 100% safe. As they say in the IT security industry, "Your biggest threats are the end users". Technology can only go so far.
If you're REALLY paranoid, install Find My iPhone, and if you lose it, remote wipe it.
Rend It
Aug 5, 06:07 PM
snippet
Why is Front Row dependent on iSight ?
No good, clear reason. It's just that Front Row usually goes along with PhotoBooth, so.... Also, it seems that Apple might be really pushing iChat with Leopard, especially video chatting, and the iMac, MBP, and MB all have iSights. It doesn't seem too crazy to believe that perhaps Apple wants a built-in camera in all of their hardware. In the case of the Mac Pro, Xserve, and Mini, the natural place for such a device is a display.
Pure speculation, of course. :D
Silentwave
Jul 14, 06:22 PM
320GB would be the standard. You could upgrade to a terabyte if there are still two HDD bays.
Heck you could have 1.5TB with the new Seagate 750GB drives!
ready2switch
Sep 19, 09:32 AM
It gets annoying. Why? Because it's true and most people don't want to admit it.
In a few cases here and there, the extra processor power/speed is going to help. But for a majority of people buying a MacBook, they're not going to be burning home-made DVD's, doing intense Music compositions, or using it for hard-core gaming. They're going to SURF and WRITE.
As for the "resale" value, again, most people who are buying a used MacBook are NOT going to ask "is it a Merom?" They're going to ask how nice the case is, how much use it's gotten, and how much it is, and that's it.
Everybody likes to play "ooo, I'm the hard-core computing whiz and I need the BEST out there", but I bet you if you took an honest poll of everyone who's answered this thread, you'd find at least 75% of these Apple fans have no need for the extra speed; they just want it because it's "cool" and "fast" and it's the latest thing out there.
62% of all statistics are made up to add false weight to the speaker's argument.
:eek:
Unless you have conducted or can cite a scientific study calculating exactly how Mac users USE their Apple machine, stop calling other people annoying and claiming to know exactly how overpowered these systems are for "most" of the users.
MattSepeta
Apr 27, 12:16 PM
These people never stop, do they? I don't remember anyone asking Bush or any other president about their educational records, plus the one time they shed light on Bush's military record it just seemed to disappear into thin air.
At least now the president's chances of getting re-elected in 2012 just skyrocketed.
A few things.... Hillary did get the ball rolling before Obama was nominated...
And all presidents are plagued by these wacky conspiracy theories... GWB had his military service issues and the truther movement, WJC had "Clinton Bodycount" (arguably more insane and dark than the birther thing), Kennedy had plenty, etc...
What I don't understand is the "outrage" we are seeing over this. People claim Obama is not a citizen. OK, well it's crazy sounding, but it's not dark or destructive. How about the truther movement? That is pure insidiousness.
For
Moyank24
Feb 28, 08:46 PM
No because heterosexuality is the default way the brain works
And your proof of this is......??
Heterosexuality is the default way your brain may work. But just because it's like that for you, doesn't mean it's like that for us all.
chasemac
Aug 7, 05:46 PM
Can't believe only 8 people voted for 64-bit; it's the most profound change here.... all the others you can achieve with some third-party software.
Same here. To me it is one of the most significant upgrades of all of them.
MattyMac
Aug 11, 11:09 AM
Yes Yes Yes
slackpacker
Apr 12, 09:25 AM
Naw, memory too. There's probably a lot I left out, it was just a quick list off the top of my head.
64-bit just expands memory addressing; it does not have anything to do with being multiprocessor-aware.
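The point about 64-bit being purely an addressing change can be shown with quick arithmetic: pointer width caps the flat address space, and core count never enters into it. A minimal Python sketch (my own illustration, not from the post):

```python
# Pointer width bounds the flat address space; core count is irrelevant here.
def max_addressable_bytes(pointer_bits):
    """Upper bound on byte-addressable memory for a given pointer width."""
    return 2 ** pointer_bits

GIB = 2 ** 30  # bytes per GiB

# A 32-bit pointer tops out at 4 GiB; 64-bit raises the ceiling to 16 EiB.
print(max_addressable_bytes(32) // GIB)  # 4 (GiB)
print(max_addressable_bytes(64) // GIB)  # 17179869184 (GiB, i.e. 16 EiB)
```

In practice, real CPUs of the era implemented fewer physical address bits than 64, but the 4 GiB wall for a 32-bit flat address space is exact.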
Reach
Sep 19, 12:05 PM
History has shown that having a product out sooner... doesn't mean you win the market.
Playstation?
The video game market is completely different; the analogy is just a stupid attempt at making people who think Apple should release up-to-date hardware look stupid. Have fun at that, it didn't do much in convincing me that I should buy a CD when a C2D is just around the corner.
All you people trying to make us feel like complete morons for waiting and wanting a new (AND BETTER) chip, what's wrong with you?! Did you just buy a MBP and feel the need to piss on everyone that is about to get a better machine than you? Or is it just PMS or some other hormonal condition?
dornoforpyros
Jul 14, 02:57 PM
Eh, I'm willing to bet they stick with the G5-type case; I mean, the MacBook is the only "new" case we've seen with the Intel transition.
Meandmunch
Apr 8, 07:51 AM
I had a strange experience at Best Buy. About two days before the iPad 2 came out, I went to my local Best Buy to ask about availability on release day. The employee I spoke to told me essentially that I should wait. He told me the iPad 3 was coming this fall and I should either skip the iPad 2 or purchase something like the Xoom. I pressed him on how he could possibly know that; I said I read all the rumor mills and such, and time and time again no one actually ever knows that information. He said "they all did" (Best Buy employees); it was posted on their "E-Learnings" site, which is basically an internal Best Buy training/notification/product-information system.
So here is an employee telling me not to purchase an iPad 2 because he thought the Xoom was better AND I should just wait because the iPad 3 was coming out this fall.
WTF?
tortoise
Aug 23, 03:04 PM
Do you have a reference showing that this translates to better performance in real-world application tests in a head to head competition?
Not handy, since a lot of this happened on mailing lists.
The short version is that the memory performance scales in a very sub-linear fashion as a function of the number of cores being used, whereas Opteron scalability is almost linear up to a large number of cores. The good news is that for single dual-core processors the memory performance is on par with dual-core Opterons and their in-cache performance can be better. The bad news is that this performance does not hold as you scale cores in a system. So for some applications (e.g. those that live mostly in cache) the Woodcrest processors will be mildly faster than Opterons, but for most the performance is about even in real app benchmarks.
I've seen fairly comprehensive benchmarks for both databases and scientific computing applications, both of which thoroughly exercise the memory subsystem. Even though a single Intel core theoretically has more bandwidth, the high latency means that the real bandwidth is about the same as the slower Opterons (which have real bandwidth that approaches theoretical) and the cross-sectional bandwidth of Opterons when you get up to 4 cores and higher is much higher since the scaling is almost linear with the number of cores. For Intel, I think it was the case that a bigger cache was a cheaper design choice than a truly scalable memory subsystem. As a result, they will have different competencies. Some types of floating point codes should run very well on Intel.
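The scaling behavior described above can be caricatured with a toy model: a shared front-side bus saturates at a fixed aggregate limit, while a NUMA design adds a memory controller per socket and scales roughly linearly. This is purely an illustrative sketch with made-up bandwidth numbers, not benchmark data:

```python
# Toy model of aggregate memory bandwidth (illustrative numbers only).

def shared_bus_bw(cores, per_core_bw, bus_limit):
    # All cores contend for one bus: throughput saturates at the bus limit,
    # so scaling is sub-linear once demand exceeds the bus.
    return min(cores * per_core_bw, bus_limit)

def numa_bw(cores, per_core_bw):
    # Each core's local controller adds bandwidth: roughly linear scaling.
    return cores * per_core_bw

for n in (1, 2, 4, 8):
    print(n, shared_bus_bw(n, 4.0, 10.0), numa_bw(n, 4.0))
```

With these assumed numbers, both designs look similar at one or two cores, but the shared bus flatlines at its 10-unit ceiling by four cores while the NUMA curve keeps climbing, which matches the post's claim that the gap only shows up as you add cores.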
gorgeousninja
Apr 20, 05:54 AM
WRONG! They weren't invented at Apple's Cupertino HQ, they were invented back in Palo Alto (Xerox PARC).
Secondly, your source is a pro-Apple website. That's a problem right there.
I'll give you a proper source, the NYTimes (http://www.nytimes.com/1989/12/20/business/xerox-vs-apple-standard-dashboard-is-at-issue.html), which wrote an article on Xerox vs Apple back in 1989, untarnished, in its raw form. Your 'source' was cherry picking data.
Here is one excerpt.
Then Apple CEO John Sculley stated:
^^ that's a GLARING admission, by the CEO of Apple, don't you think? Nevertheless, Xerox ended up losing that lawsuit, with some saying that by the time they filed it, it was too late. The lawsuit wasn't thrown out because they didn't have a strong case against Apple, but because of how the lawsuit was presented at the time.
I'm not saying that Apple stole IP from Xerox, but what I am saying is that it's quite disappointing to see Apple fanboys trying to distort the past into making it seem as though Apple created the first GUI, when that is CLEARLY not the case. The GUI had its roots at Xerox PARC. That is a FACT.
http://upload.wikimedia.org/wikipedia/en/7/78/Rank_Xerox_8010%2B40_brochure_front.jpg
You're really pushing this aren't you? So what exactly is your point that has a significant relevance to the main topic? ...None, that's what.
The fact that 30 years ago Apple took an idea initially developed by Xerox, then improved upon it and subsequently released to the mass market a product that most people acknowledge as being the first home computer with a GUI, has absolutely no bearing on the fact that Samsung have blatantly copied Apple's design.
simie
Aug 17, 05:22 AM
I think that these tests are poor regardless of the results. Testing is all based on evidence and I see none, just what they say are the results.
When you run a test you normally document the process and the test conditions. You don't just say "Photoshop CS2 - MP-aware actions" without saying which ones. Why didn't they use the Photoshop test?
"For FCP 5, we rendered a 20 second HD clip we had imported and dropped into a sequence."
Does this mean they imported a 20-second clip into a sequence and had to render the clip before it would play with the rest of the sequence?
They basically used the render tools in the sequence menu. Why measure something like that?
63dot
Aug 17, 11:46 AM
so if apple gets a 3 socket logic board, or a 4 socket one, we could have 12 or 16 cores.
now we are talking...processors get me so horny :)
i used to go bug my friend who worked in the field, in his past life, soldering very small widgets and thingies on motherboards and processors in the 80s and early 90s...he burned out and became a private investigator for way less money than an electrical engineer in the valley...but way more exciting since he gets to carry a gun (can anybody say midlife crisis?)
actually, my love of processors was not that great...i dropped out of a phd program in computer engineering specializing in mass networking equipment processors and chipsets...but those are in a totally different price range...and there are some exciting ideas in the world of processing using water molecules and string theory, but that's way out there right now
anyway, for my normal daily uses here at home, i am eyeing the 17" imac and that would actually be the best machine for me, dollar for dollar, and a truly fine machine to replace my five year old power mac
mrkramer
Apr 27, 02:27 PM
First off, before the ignorant attacks begin, no I'm not a birther. I'm personally of the opinion that he was born in America and generally share the president's feelings that this is a giant waste of time.
Sorry, but your claim that you aren't a birther is like someone who says that they have a lot of friends who are black as an excuse to then say something racist. In this post and previous posts in the PRSI, you have shown that you clearly question where Obama was born.
That said, I don't think Obama should have released it, he has other more important things to do, and he's already proven his citizenship several times.
fivepoint
Apr 27, 04:19 PM
It'd be fascinating to see how much people cared about 'layers' if the documents in question related to Bush's National Guard deployment or something similar. ;) Haha, no bias here boys!
The difference between me and you is that I'd want an explanation in either account. ;)