What's the deal with the HL2 benchmarks?

Sharky Forums



  1. #1
    Tiger Shark Seoulstriker's Avatar
    Join Date
    Dec 2003
    Posts
    976

    What's the deal with the HL2 benchmarks?




    http://www.extremetech.com/article2/...1264303,00.asp

    this is what was written in the conclusion:

    nVidia has been circulating its Det50 driver to analysts in hopes that we would use it for our Half-Life 2 benchmarking. The driver contains application-specific optimizations that will likely improve nVidia's overall performance picture; however, Valve's Gabe Newell expressed concerns that some of nVidia's optimizations may go too far. Doug Lombardi of Valve has explicitly asked that beta versions of the Det50 drivers not be used for benchmarking.
    WHY ON EARTH WOULD THE GUY FROM VALVE SAY THAT THE DET50 DRIVERS CAN'T BE USED FOR BENCHMARKING???


    so Valve, with the bribe from ATi, purposely used nvidia drivers which were not optimized for DirectX 9.0, and Valve purposely left HL2 unoptimized by not using the unified compiler provided by nvidia?

    all of this with ATi bundling HL2 with every ATi card? what's the deal with this?

  2. #2
    Mako Shark Paladyr's Avatar
    Join Date
    Mar 2001
    Location
    Cinci, OH
    Posts
    4,562
    hahahahahahhahahhaahahahaha

    You're kidding, right?? Have you been living under a rock? If the Det50s aren't compatible with DX9, then they shouldn't be used for a DX9 benchmark, period.

    Since it is a beta driver, they could be using some serious cheats, like they (nvidia) were with 3DMark (culling parts of the scene that were never visible from the benchmark's fixed camera path to inflate FPS, sacrificing the integrity of the benchmark).
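    For anyone wondering how a driver can even tell which application is running: it can simply look at the name of the executable that loaded it and switch behavior. A minimal C++ sketch of the detection idea (purely hypothetical, not NVIDIA's actual code; the exe-name check is only for illustration):

        #include <windows.h>
        #include <algorithm>
        #include <cctype>
        #include <cstdio>
        #include <string>

        // Name of the executable that loaded this code, lowercased.
        static std::string host_exe_name() {
            char path[MAX_PATH] = {};
            GetModuleFileNameA(nullptr, path, MAX_PATH);  // NULL = the host .exe
            std::string name(path);
            size_t slash = name.find_last_of("\\/");
            if (slash != std::string::npos)
                name = name.substr(slash + 1);
            std::transform(name.begin(), name.end(), name.begin(),
                           [](unsigned char c) { return (char)std::tolower(c); });
            return name;
        }

        int main() {
            // A driver could key "optimizations" off the exe name like this.
            bool special = (host_exe_name() == "3dmark03.exe");
            std::printf("app-specific path: %s\n", special ? "on" : "off");
            return 0;
        }

    Detection like this is also why reviewers took to renaming benchmark executables to see whether the scores changed.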
    Last edited by Paladyr; 01-27-2004 at 04:05 PM.
    Core 2 Duo w Radeon 4850, Droid Phone, Mits HD1000U projector

  3. #3
    Tiger Shark Seoulstriker's Avatar
    Join Date
    Dec 2003
    Posts
    976
    I'll pardon your ignorance:

    During the entire development of Half-Life 2, NVIDIA has had close technical contact with Valve regarding the game. However, Valve has not made us aware of the issues Gabe discussed.

    We're confused as to why Valve chose to use Release 45 (Rel. 45) - because up to two weeks prior to the Shader Day we had been working closely with Valve to ensure that Release 50 (Rel. 50) provides the best experience possible on NVIDIA hardware.

    Regarding the Half-Life 2 performance numbers that were published on the web, we believe these performance numbers are invalid because they do not use our Rel. 50 drivers. Engineering efforts on our Rel. 45 drivers stopped months ago in anticipation of Rel. 50. NVIDIA's optimizations for Half-Life 2 and other new games are included in our Rel. 50 drivers - which reviewers currently have a beta version of today. Rel. 50 is the best driver we've ever built - it includes significant optimizations for the highly-programmable GeForce FX architecture and includes feature and performance benefits for over 100 million NVIDIA GPU customers.

    Pending detailed information from Valve, we are unaware of any issues with Rel. 50 and the drop of Half-Life 2 that we have. The drop of Half-Life 2 that we currently have is more than 2 weeks old. It is not a cheat or an over-optimization. NVIDIA's Rel. 50 driver will be public before the game is available. Since we know that obtaining the best pixel shader performance from the GeForce FX GPUs currently requires some specialized work, our developer technology team works very closely with game developers. Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit. Sometimes this involves converting 32-bit floating point precision shader operations into 16-bit floating point precision shaders in order to obtain the performance benefit of this mode with no image quality degradation. Our goal is to provide our consumers the best experience possible, and that means games must both look and run great.

    The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.

    In addition to the developer efforts, our driver team has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. The fruits of these efforts will be seen in our Rel. 50 driver release. Many other improvements have also been included in Rel. 50, and these were all created either in response to, or in anticipation of, the first wave of shipping DirectX 9 titles, such as Half-Life 2.

    We are committed to working with Gabe to fully understand these issues.
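    A concrete aside on the 16-bit vs. 32-bit point in the statement above: FP16 ("half") keeps a 10-bit mantissa versus FP32's 23 bits, which is usually plenty for color values in [0,1] but loses precision quickly at larger magnitudes. A self-contained C++ sketch (my own illustration, simplified to normal-range positive numbers, not anything from NVIDIA) that rounds floats through half precision and prints the error:

        #include <cstdint>
        #include <cstdio>
        #include <cstring>

        // Round a 32-bit float to the nearest value representable in IEEE
        // half precision. Simplified: assumes normal-range positive input
        // and ignores half's exponent limits (max ~65504).
        static float round_to_half(float f) {
            uint32_t bits;
            std::memcpy(&bits, &f, sizeof bits);
            bits += 1u << 12;              // add half a ULP: round, don't truncate
            bits &= ~((1u << 13) - 1u);    // keep 10 of the 23 mantissa bits
            float out;
            std::memcpy(&out, &bits, sizeof out);
            return out;
        }

        int main() {
            // Color-range values survive nearly untouched...
            const float small[] = {0.5f, 0.7231f, 1.0f / 3.0f};
            for (float v : small) {
                float h = round_to_half(v);
                std::printf("%.7f -> %.7f (err %.2e)\n", v, h, v - h);
            }
            // ...but larger values (think world-space coordinates) lose
            // visible precision, which is where FP16 artifacts come from.
            std::printf("%.4f -> %.4f\n", 4096.25f, round_to_half(4096.25f));
            return 0;
        }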
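    And on the "optimal code path is different per vendor" point: a DX9 game can pick its shader path by asking Direct3D for the adapter's PCI vendor ID. A minimal sketch (the path names are made up; 0x10DE and 0x1002 are the real NVIDIA and ATI vendor IDs):

        #include <d3d9.h>   // Direct3D 9; link with d3d9.lib
        #include <cstdio>

        int main() {
            IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
            if (!d3d) return 1;

            D3DADAPTER_IDENTIFIER9 id = {};
            d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id);

            // PCI vendor IDs: 0x10DE = NVIDIA, 0x1002 = ATI.
            const char* path = "generic DX9 path";
            if (id.VendorId == 0x10DE)
                path = "NV3x path (partial-precision shaders)";
            else if (id.VendorId == 0x1002)
                path = "full-precision DX9 path";

            std::printf("%s -> %s\n", id.Description, path);
            d3d->Release();
            return 0;
        }

    HL2 itself reportedly picks its paths by hardware capability level rather than raw vendor ID; the vendor check above is just the simplest way to illustrate per-vendor code paths.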

  4. #4
    Great White Shark mikeysg's Avatar
    Join Date
    May 2003
    Posts
    8,740
    Guys, this HL2 issue has been beaten to death already. Just ignore this and let this thread die!
    Save Ana
    http://www.jpages.net/HelpAna.htm

    Main Rig - All AMD build: R9 3900X | GB X570 Aorus Xtreme | 2x 8GB 3600CL17 Patriot ViperRGB | Sapphire Nitro+ RX 6900 XT | 256GB Sabrent Rocket NVMe M.2 (OS) | 4TB Samsung 860 EVO SATA SSD | 2TB Samsung QVO SATA SSD | 2x 1TB Samsung 850 EVO | 6TB WD Black | Corsair HX 1000 Platinum | CM H500M | Samsung LC49HG90DMEXX 32:9 | iFi Micro iDSD BL | LG SL8YG Soundbar | 64bit Win10 Pro

  5. #5
    Tiger Shark Seoulstriker's Avatar
    Join Date
    Dec 2003
    Posts
    976
    has the issue of valve purposely making nvidia's hardware suck so that ATi sells well been discussed?

  6. #6
    Mako Shark Paladyr's Avatar
    Join Date
    Mar 2001
    Location
    Cinci, OH
    Posts
    4,562
    Were they using beta ATI drivers? Just because nVidia says they are the best drivers doesn't mean they aren't cheating. What good is a benchmark if ATI/nVidia are allowed to cheat??? It won't tell you how the card will actually perform in the game if they are cheating.
    Core 2 Duo w Radeon 4850, Droid Phone, Mits HD1000U projector

  7. #7
    Great White Shark lonewolfroger's Avatar
    Join Date
    Aug 2002
    Location
    Texas
    Posts
    7,214
    Originally posted by Seoulstriker
    has the issue of valve purposely making nvidia's hardware suck so that ATi sells well been discussed?
    Yes, this was discussed a while back, to a great extent.
    i7 920 @3GH, P6T Deluxe V2, OCZ HP 6GB(3x2)Kit(7-7-7-16), 128GB Patriot SSD, 250GB SATA(7200)Hitachi, WD 500gb, 1000W BFG PSU, SATA Pioneer DVD CD burner, XFX5870 XXX edition(900/1300), LG LED 2350V, XFI Titanium Fatal1ty champion, Logitech X-540, Lian Li PC-B70 Full Tower Case, Win7 HP.

  8. #8
    Mako Shark Paladyr's Avatar
    Join Date
    Mar 2001
    Location
    Cinci, OH
    Posts
    4,562
    Originally posted by Seoulstriker
    has the issue of valve purposely making nvidia's hardware suck so that ATi sells well been discussed?
    obtuse.

    Also, the det 50s will still be released, and then we'll see the performance. Who cares how that makes nVidia look right now? They've already done a nice job making themselves look unethical!!! Do you have strong personal feelings for nVidia?? If the Det 50s are as good as they say they are, and there isn't any cheating going on, they will be used for benchmarks and nVidia will get whatever results it gets with the det 50s. What's the big deal?? If I were a reviewer and nVidia or ATI came into my office and said:

    "Quick, use these drivers instead of our latest publicly released ones because they are uber sweet," I would say no, because I haven't had a chance to evaluate whether or not any cheats were being used in the drivers, which would sacrifice the integrity of the benchmark.
    Last edited by Paladyr; 01-27-2004 at 04:35 PM.
    Core 2 Duo w Radeon 4850, Droid Phone, Mits HD1000U projector

  9. #9
    Great White Shark Un4given's Avatar
    Join Date
    Oct 2000
    Location
    Salt Lake City, UT United States
    Posts
    22,555
    Originally posted by lonewolfroger
    Yes, this was discussed a while back, to a great extent.
    No doubt. That horse got beaten into glue.
    Prince of the OC Crusaders

    Intel i7 3.2GHz @ 4.24GHz
    Cooler Master V8
    Asus P9X79 Pro
    16GB Patriot Viper Extreme DDR3-1600 (quad channel)
    HIS R9 290X @1050MHz
    Asus 20x DVD-RW DL DVD-RW

  10. #10
    Mako Shark Paladyr's Avatar
    Join Date
    Mar 2001
    Location
    Cinci, OH
    Posts
    4,562
    Originally posted by Un4given
    No doubt. That horse gotten beaten into glue.
    I must've missed it, but I can imagine it was :-P. Last I heard though, nVidia still needs to run in Mixed mode for acceptable performance and that isn't DX9 compliant.
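    For context, HL2's mixed mode largely means running many of the FX path's shaders at partial (16-bit) precision instead of full DX9 precision. With the D3DX9 compiler, a developer can request that per shader via D3DXSHADER_PARTIALPRECISION; a hedged sketch (the shader source and entry point here are made up for illustration):

        #include <d3dx9.h>   // D3DX9 shader compiler; link with d3dx9.lib
        #include <cstdio>

        // A trivial made-up ps_2_0 pixel shader: modulates a color constant.
        static const char kShader[] =
            "float4 tint : register(c0);              \n"
            "float4 main(float4 col : COLOR0) : COLOR \n"
            "{ return col * tint; }                   \n";

        int main() {
            LPD3DXBUFFER code = nullptr, errors = nullptr;

            // D3DXSHADER_PARTIALPRECISION asks for partial-precision (FP16)
            // computation: the same tradeoff NVIDIA's statement describes,
            // faster on GeForce FX at the cost of 32-bit float precision.
            HRESULT hr = D3DXCompileShader(
                kShader, sizeof(kShader) - 1,
                nullptr, nullptr,            // no macros, no includes
                "main", "ps_2_0",
                D3DXSHADER_PARTIALPRECISION,
                &code, &errors, nullptr);

            std::printf("compile %s\n", SUCCEEDED(hr) ? "ok" : "failed");
            if (code)   code->Release();
            if (errors) errors->Release();
            return 0;
        }

    Whether that counts as "DX9 compliant" is exactly the argument in this thread: full precision under the DX9 spec means at least FP24, which the FX parts only reach in their slower FP32 mode.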
    Core 2 Duo w Radeon 4850, Droid Phone, Mits HD1000U projector

  11. #11
    Hammerhead Shark Dimitris's Avatar
    Join Date
    Oct 2001
    Location
    London,UK
    Posts
    1,050
    Originally posted by Paladyr
    I must've missed it, but I can imagine it was :-P. Last I heard though, nVidia still needs to run in Mixed mode for acceptable performance and that isn't DX9 compliant.
    Do you care about pre-release benchmarks, or do you judge a card without the game even being released? It's well known that Nvidia didn't completely follow the DX9 spec. They are trying very hard to make their cards as competitive as possible. What matters is what drivers we use when we play the game. And AFAIK, HL2 will not be out until the 4th quarter of 2004. By then, new-gen cards will already be out and no one can guess which one will be better. So instead of bringing up history, try to dig up some info about the NV40 and R420. And as for HL2, forget it, get Doom 3.
    Rig1:
    Pentium 4C 3.0@3.53
    Abit IC7-G MAX2
    Dual Challenel 1024mb TWINMOS 3700
    Geforce FX5950@ 594/1065
    Nec 2500 DVD-RW
    Fortissmo III
    Creative Megaworks 5.1

    Rig 2
    AMD 2500XP+ @(waiting result)
    Epox 8RDA+ rev 2.x
    Corsair 3500 rev1.1 BH5
    Geforce FX5800@ 530/1030
    Sony 52 CD-RW
    SB AUDIGY

  12. #12
    Hammerhead Shark xelaeel's Avatar
    Join Date
    Feb 2002
    Location
    Toronto
    Posts
    2,232
    I missed it too. Anyone have a link to that kind of thread?
    XELA!

    Athlon XP2400+ @ 2343Mhz (11x213)
    SLK 947U w/92mm Tornado w/Nexus 205
    Abit NF7-S Revision 2.0
    OCZ PC-3200 Platinum 2x512MB @ 426Mhz
    ATI Radeon X850XT
    WD 60GB | WD 40GB | WD 160GB SATA
    Antec Plusview 1000AMG | Thermaltake PurePower 420W Active PFC
    Windows XP Pro SP2
    Creative Sound Blaster Audigy 2 ZS | Creative MegaWorks 550 THX


    Laptop 1: Dell XPS M1710 | T2500 | 2GB RAM | GeForce Go 7900 GTX | Hitachi 100GB 7200RPM
    Laptop 2: MacBook | T7400 | 1GB RAM | 120GB

  13. #13
    Great White Shark Un4given's Avatar
    Join Date
    Oct 2000
    Location
    Salt Lake City, UT United States
    Posts
    22,555
    Originally posted by Dimitris
    Do you care about pre-release benchmarks, or do you judge a card without the game even being released? It's well known that Nvidia didn't completely follow the DX9 spec. They are trying very hard to make their cards as competitive as possible. What matters is what drivers we use when we play the game. And AFAIK, HL2 will not be out until the 4th quarter of 2004. By then, new-gen cards will already be out and no one can guess which one will be better. So instead of bringing up history, try to dig up some info about the NV40 and R420. And as for HL2, forget it, get Doom 3.
    Pffft, Doom III is going to be another engine showcase for id. I've seen the in-game movie cuts and I'm not all that impressed with what I've seen. MP is what most people look for these days, and Doom III isn't going to cut it with the extremely limited MP gameplay it has.
    Prince of the OC Crusaders

    Intel i7 3.2GHz @ 4.24GHz
    Cooler Master V8
    Asus P9X79 Pro
    16GB Patriot Viper Extreme DDR3-1600 (quad channel)
    HIS R9 290X @1050MHz
    Asus 20x DVD-RW DL DVD-RW

  14. #14
    Hammerhead Shark icecube_of_death's Avatar
    Join Date
    Jun 2003
    Location
    Toronto
    Posts
    2,049

    Re: What's the deal with the HL2 benchmarks?

    Originally posted by Seoulstriker
    Valve purposely left HL2 unoptimized by not using the unified compiler provided by nvidia?
    Blame the GeForce FX delay and the 16/32-bit precision on nv cards. It's neither Valve's nor ATi's fault; it was nv who broke the rules.
    XP 1700+ @ 2.3 Ghz (219 x 10.5) 1.63v | Abit NF7-S rev. 2 d22 Alpha 1 Turbo | 512Mb PC2700 DDR333 | Thermalright SK-7 w/ Smart Fan II | Maxtor 6E030L0 30Gb | MSI GeForce4 Ti4200 128Mb 315/560 | Thermaltake Giant II | LG 40/12/40 | WinXP SP1

    Sharky Extreme 3DMark Team | Alpha Dir icecube_of_death of the Cooler's Guild | Sharky Folding @ Home Team

    3DMark can kiss my ***

  15. #15
    Mako Shark Paladyr's Avatar
    Join Date
    Mar 2001
    Location
    Cinci, OH
    Posts
    4,562
    Originally posted by Dimitris
    Do you care about pre-release benchmarks, or do you judge a card without the game even being released? It's well known that Nvidia didn't completely follow the DX9 spec. They are trying very hard to make their cards as competitive as possible. What matters is what drivers we use when we play the game. And AFAIK, HL2 will not be out until the 4th quarter of 2004. By then, new-gen cards will already be out and no one can guess which one will be better. So instead of bringing up history, try to dig up some info about the NV40 and R420. And as for HL2, forget it, get Doom 3.
    I do care about pre-release benchmarks, as long as no cheating is involved. I don't put too much stock in them however.

    That's great that nv40 and r420 will be out by then, but that is not what we are discussing.

    I too have far higher hopes for HL2 than Doom III.
    Core 2 Duo w Radeon 4850, Droid Phone, Mits HD1000U projector
