26 Comments

  • dcollins - Monday, April 30, 2012 - link

    All these Ivy Bridge systems are making me jealous. Does anyone know when we can expect retail availability for these chips?
  • 8steve8 - Tuesday, May 1, 2012 - link

    I bought a 3570K yesterday.
  • Grizzlebee - Tuesday, May 1, 2012 - link

    Me too!
  • aguilpa1 - Tuesday, May 1, 2012 - link

    I have an M17X R3 with a 2860QM and a GTX 580M, and really, other than perhaps a lower power draw, performance won't be any different from the GTX 580M.
  • bennyg - Wednesday, May 2, 2012 - link

    erm, no, the 7970M prelim benchmarks are a good 30% higher?

    Agree on IB CPUs though, their biggest leap is in the iGPU with minimal IPC increase. And with overclocking, especially with volts, power consumption is much much higher.

    Performance increases with each new Intel CPU generation are limited, as they have NO competition in the high end to worry about; no Netburst=>Conroe revolution has been required, such is their dominance.
  • aguilpa1 - Wednesday, May 2, 2012 - link

    I was referring to the M17X with the 675M's. I would think at some point a 680M version will become available that will likely be...30% faster, look what a coincidence.
  • ViperV990 - Monday, April 30, 2012 - link

    With the M17x, Alienware doesn't seem to allow you to pair the 7970M with the 120Hz display. What a shame!
  • JarredWalton - Monday, April 30, 2012 - link

    Correct, the 120Hz LCD is designed for NVIDIA's 3D Vision bundle. I guess the thought process is that if you're getting a 3D display you'd want to use it. In reality, I know quite a few people that would just like a 120Hz 2D display.
  • Meaker10 - Tuesday, May 1, 2012 - link

    It's a weird incompatibility issue with shutting down the integrated graphics chip and running 120Hz on AMD.
  • oshogg - Tuesday, May 1, 2012 - link

    Where have all those 1920x1200 resolutions disappeared to? And why not go for 2560x1600 for a higher-DPI display on high-end laptops?

    Very disappointing.
  • MGSsancho - Tuesday, May 1, 2012 - link

    It would be nice, yes, but more pixels means more oomph needed to drive a game with maxed-out settings, crazy AA, and acceptable frame rates. There is a balance between having a laptop able to play games maxed out and letting the end user set what he/she thinks is best. Still, more DPI would be welcome. I imagine there are plenty of people who purchase these systems not for their gaming ability but perhaps for their content creation abilities. Who knows for sure, but more options are nice, to be honest.
  • PCMerlin - Tuesday, May 1, 2012 - link

    I agree 100% - I can't find any logic to explain why I can't even get a display that equals my 5-year-old Gateway laptop's. If I can't get at least the same resolution, then why upgrade?

    Take a hint from Apple and the iPad's Retina display - people really do want the higher resolution, and they are willing to pay for it. There's no need to make ALL the options 1920x1200 or higher... at least just make ONE an option.
  • JarredWalton - Tuesday, May 1, 2012 - link

    Logic? How about: "We can sell a display that is slightly cheaper to produce and has a slightly smaller area and a lower resolution, and by marketing it as 1080p we can get interest from the mass consumer market that only knows 1080p goes with Blu-ray. Step 2: profit!"

    You may not like the logic and you may feel that it's causing stagnation in the industry, but to say there's no logic at all would be incorrect. It's all about big business and making money, and the bean counters appear to think that 16:9 is a better way to make money than 16:10. Which really, really sucks and I've been campaigning against such attitudes for years, mostly without a lot of success.
  • seapeople - Sunday, May 6, 2012 - link

    If it's so logical to race to the bottom of the barrel on screen quality and resolution, then why is the most successful and profitable company in the world pushing high resolution and high quality screens?
  • KoolAidMan1 - Tuesday, May 1, 2012 - link

    Performance is something to keep in mind. You get much smoother framerates with those GPUs outputting to a 1920x1080 display than outputting to a 2560x1440 display.
  • Grizzlebee - Tuesday, May 1, 2012 - link

    But, at the same time, 2560x1440 will look better with lower settings.
  • JarredWalton - Tuesday, May 1, 2012 - link

    1920x1080 running a native 1920x1080 will look better than 2560x1440 running at non-native 1920x1080, and to properly run games at high detail settings and QHD resolution, you need roughly the power of a desktop GTX 580/680 or HD 6970/7970. Even the CF 7970M and SLI 675M would struggle to drive QHD (or QWXGA), never mind the fact that DPI scaling would still be a bit of a crap shoot.
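    The arithmetic behind that GPU-power claim is easy to check: the extra work scales roughly with the pixel count. A quick sketch (plain arithmetic, no benchmark data):

    ```python
    # Pixel counts for the resolutions discussed above; the ratio is a rough
    # proxy for the extra fill-rate/shading work the GPU must do.
    res_1080p = 1920 * 1080   # 2,073,600 pixels
    res_qhd   = 2560 * 1440   # 3,686,400 pixels (QHD)
    res_wqxga = 2560 * 1600   # 4,096,000 pixels (16:10 equivalent)

    print(res_qhd / res_1080p)    # ~1.78x the pixels of 1080p
    print(res_wqxga / res_1080p)  # ~1.98x, nearly double
    ```

    Roughly 1.8-2x the pixels is why QHD gaming in 2012 calls for desktop-class GPU power.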
  • erple2 - Thursday, May 3, 2012 - link

    True, but eventually you get to a point on an LCD where running "any" resolution still looks good, provided your monitor has the DPI to support it.

    Eventually the resolution density of the LCD is so large that any "reasonable" resolution looks "crisp" on it, even off-resolutions. At least, your eye can't tell the difference. I think that as the DPI of the screen gets sufficiently large, you can actually run the screen at any resolution without visual issues.

    Kind of hearkening back to the CRT days when 640x480 looked just as "crisp" as 1600x1200 on my 17" monitor. Though what might be more useful is to keep track not of each triad of subpixels as a single unit, but of each subpixel as its own unit. Perhaps then we'll see "clearer" off-resolution displays...
  • JarredWalton - Friday, May 4, 2012 - link

    Yeah, I'm not sure what level of pixel density I'd want to make the case that everything looks "nearly like native". 2560x1600 (or 1440 if they insist on 16:9) in a 17" or smaller display would probably do it. I'd never run it at native, though -- heck, 2560x1600 at 30" is actually still a bit uncomfortable for text! And I've been using it for years.
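    The pixel densities being compared above are straightforward to compute from resolution and diagonal size; a small sketch (the 17" QWXGA panel is hypothetical, as in Jarred's comment):

    ```python
    import math

    def ppi(width_px, height_px, diagonal_in):
        """Pixels per inch: diagonal pixel count divided by diagonal inches."""
        return math.hypot(width_px, height_px) / diagonal_in

    print(round(ppi(2560, 1600, 30.0)))  # 30" desktop panel: ~101 PPI
    print(round(ppi(2560, 1600, 17.0)))  # hypothetical 17" panel: ~178 PPI
    print(round(ppi(1920, 1080, 17.3)))  # typical 17.3" 1080p notebook: ~127 PPI
    ```

    At ~178 PPI a 17" panel would be well past typical 2012 notebook densities, which is why non-native resolutions start to look acceptable there.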
  • Denithor - Tuesday, May 1, 2012 - link

    On the M14x R2 you list the following configurations:

    Memory 6/8/12/16GB DDR3-1600 (two SO-DIMMs)

    The only way to get to 6GB or 12GB in a two-stick arrangement is with mismatched RAM, i.e. 2GB + 4GB or 4GB + 8GB. Doesn't this mean they will be running in single-channel memory mode? Why the heck would they even offer an arrangement that would gimp your performance?
  • JarredWalton - Tuesday, May 1, 2012 - link

    It means you're running in Intel's "Flex Memory" mode, where on 6GB the first 4GB is full dual-channel performance and the last 2GB is not, or with 12GB the first 8GB gives you dual-channel performance and the last 4GB does not. Intel's Flex Memory Technology has been around since at least 2004 I believe.
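    The split Jarred describes follows directly from the DIMM sizes; a minimal sketch of that arithmetic (a hypothetical helper, not Intel's actual implementation):

    ```python
    def flex_memory_split(dimm_a_gb, dimm_b_gb):
        """Dual- vs single-channel regions under Intel Flex Memory mode,
        per the arithmetic described above: the overlapping capacity is
        interleaved across both channels, the remainder is not."""
        dual = 2 * min(dimm_a_gb, dimm_b_gb)  # interleaved across both channels
        single = abs(dimm_a_gb - dimm_b_gb)   # remainder on the larger DIMM only
        return dual, single

    print(flex_memory_split(2, 4))  # 6GB config: (4, 2) - first 4GB dual, last 2GB not
    print(flex_memory_split(4, 8))  # 12GB config: (8, 4) - first 8GB dual, last 4GB not
    ```

    So the mismatched kits aren't purely single-channel; only the capacity beyond the smaller DIMM runs that way.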
  • Penti - Tuesday, May 1, 2012 - link

    Why go a bit overboard with the ports? The M14x and its bigger variants have VGA, HDMI, and DisplayPort via their own physical ports, which seems redundant when they could use the space for other ports and get everything through DisplayPort alone - just ship adapters and you should be fine. It works in the corporate/enterprise market at least, and at Apple for that matter. Consumers should really get stuff like WiFi Display / WiFi Direct, Intel WiDi, DLNA DMR, and so on. Plus, getting a mini-DisplayPort to HDMI cable isn't exactly more difficult than getting and bringing your HDMI cable. Would love to actually see Thunderbolt now; an eSATA or even FireWire port instead of three video ports wouldn't be bad either. It's odd to RAID mSATA plus 2.5" SSDs; I think they should have spent more time on other aspects of it. The screen is important; lights and useless functions aren't. One mSATA drive is fast enough as far as anyone is concerned; an extra 2.5" drive should be used for capacity. Maybe getting an MVA screen in there would be a good trade-off - AUO should be pretty happy to make them. There is always wide-viewing-angle TN or IPS too.

    I guess the GT 650M NVIDIA graphics is a pretty good fit for a 1600x900 laptop, though. For those looking for a mid-range gaming machine it should deliver. But don't mention the 18-inch version to me - I don't get it, and I'd rather not see the 17-inch gaming laptops regardless of whether it's a Dell or Clevo/Sager chassis. You can clearly get workstation performance in a 15-inch machine that beats entry-level to mid-range desktop workstations. Gaming performance is no problem either as long as you accept somewhat lower settings, and there is no need for the most ridiculous machines ever that simply won't be portable. I'm sure you could design something smarter for those wanting a 6 kg gaming machine.
  • Wolfpup - Wednesday, May 2, 2012 - link

    Glad the article explicitly mentions you're waiting to hear back on that.

    I sure hope neither the Nvidia nor AMD options use switchable graphics. If they don't, I'll probably be buying an M17x-R4 next!

    (Although that new Samsung notebook looks like a good deal, also assuming it doesn't use switchable graphics and there are spare PSUs available from Samsung...)
  • seapeople - Sunday, May 6, 2012 - link

    Ok, the obligatory "Why don't you like switchable graphics?" question.
  • high_rez - Friday, May 11, 2012 - link

    Do you know, if I have an Alienware with HM67, can I put Ivy Bridge in it?
  • aravenwood - Wednesday, June 20, 2012 - link

    Dustin - do you plan on reviewing the new Alienware Ivy Bridge line in detail? Particularly the new M17x w/Ivy Bridge? I bought the M17xR3 you gave such good reviews to, and it's the best laptop, and pretty near the best PC, I've ever had. (I would have paid the premium just for the quality keyboard and rubberized finish. I'm not as young as I used to be, and a good keyboard makes a world of difference to me.) My brother bought the M14x and he isn't as happy, and is thinking about upgrading to the M17x, but isn't sure whether it's worth the money to get the new M17x or to try to find a used Sandy Bridge M17x and save the premium. Or even if there is a better machine than that now available. Thx.
