THE BLACK DUCK: Forums

 


Nvidia Kepler - 2012 news
stufz (godly)
Joined: Oct 02, 2011 | Posts: 1734 | Location: sasnakia
Posted: Mon Dec 12, 2011 5:35 pm

12.12.11

Nvidia's Next-Gen Kepler Graphics Chips to Support DirectX 11.1


[11/28/2011 11:32 PM]
by Anton Shilov


Nvidia Corp.'s next-generation Kepler architecture for graphics processing units (GPUs) and compute accelerators promises a lot: new levels of performance and a new set of DirectX 11.1 capabilities. However, it will take Nvidia almost a year to fully roll out the Kepler family, according to newly published information.

In a bid to avoid another massively delayed family launch, Nvidia will start with the introduction of relatively simple products, code-named GK107 and GK106, according to information published by the 4Gamer.net web-site. Although the GK107 (128-bit memory bus) will support DirectX 11.1, it will not feature PCI Express 3.0, unlike the GK106 (256-bit memory bus). The more powerful GK104 will feature PCIe 3.0 and a 384-bit bus, whereas the GK110 is projected to carry two such chips. Both the GK104 and GK110 will be available later than the less advanced parts. The most advanced Kepler-family chip will be code-named GK112; it is projected to feature a 512-bit memory bus. This flagship single-chip solution will be the last in the Kepler 1.0 family and will presumably be released towards the end of 2012.

All Kepler-generation chips will be made using 28nm process technology at Taiwan Semiconductor Manufacturing Company. Thanks to the new fabrication process, the Kepler products have proved to be very efficient and competitive in the mobile space, according to Nvidia. The decision to address mobile computers first is part of what pushed Nvidia to concentrate on developing entry-level solutions first.



Kepler is Nvidia's next-generation graphics processor architecture that is projected to bring considerable performance improvements and will likely make the GPU more flexible in terms of programmability, which should speed up development of applications that take advantage of GPGPU (general-purpose processing on GPU) technologies. Some of the technologies that Nvidia has promised to introduce in Kepler and Maxwell (the architecture that will succeed Kepler) include a virtual memory space (which will allow CPUs and GPUs to share a "unified" virtual memory), pre-emption, an enhanced ability of the GPU to process data autonomously without the help of the CPU, and so on. Entry-level chips may not get all the features that the Kepler architecture will have to offer.

SOURCE

stufz (godly)
Joined: Oct 02, 2011 | Posts: 1734 | Location: sasnakia
Posted: Mon Jan 16, 2012 8:29 pm

Nvidia Plans to Launch Next-Gen Kepler GPUs in March or April


Nvidia Wants to Ensure Availability of Kepler at Launch


[01/15/2012 09:11 PM]
by Anton Shilov


Nvidia Corp. learned its lesson when it unveiled the code-named Fermi architecture about half a year ahead of the actual release, and it would like to ensure instant availability of next-generation graphics cards after the formal launch, according to unofficial information. At present the company is considering scheduling the launch of code-named Kepler products for March or April.

Nvidia, a leading designer of various multimedia processors, did not show any new graphics cards at the Consumer Electronics Show last week and did not even comment on its next-gen Kepler graphics processing unit (GPU) products. Instead, the company focused on demonstrating products running Nvidia Tegra 3 system-on-chips, including tablets and even cars. Although the market is clearly shifting towards portable and ultra-portable computing, standalone GPUs remain the most important business for Nvidia, and the lack of any mention of Kepler was somewhat surprising.

According to information from the VR-Zone web-site, Nvidia's new plan dictates launching the GeForce Kepler only once the company can ensure hardware availability and have all of its partners covered. While the approach may not be the best from advertising and marketing points of view, instant availability clearly lets Nvidia sell more hardware: there will be loads of impulse pre-orders online right after product reviews hit, whereas waiting kills the excitement.

Kepler is Nvidia's next-generation graphics processor architecture that is projected to bring considerable performance improvements and will likely make the GPU more flexible in terms of programmability, which should speed up development of applications that take advantage of GPGPU (general-purpose processing on GPU) technologies. Some of the technologies that Nvidia has promised to introduce in Kepler and Maxwell (the architecture that will succeed Kepler) include a virtual memory space (which will allow CPUs and GPUs to share a "unified" virtual memory), pre-emption, an enhanced ability of the GPU to process data autonomously without the help of the CPU, and so on. Entry-level chips may not get all the features that the Kepler architecture will have to offer. Production of Kepler chips was supposed to start in Q4 2011.

SOURCE


An interesting comment posted on this article by one BestJinjo:

1. GTX470/480 were launched around March 26, 2010. That's almost 2 years ago, not last year. Your time frame is way off.

2. Despite launching 6 months late, NV still managed to hold a ~60% market share on the desktop. This in itself highlights the fact that launching first, even by 6 months, doesn't guarantee success, especially since the market for $549+ graphics cards is very small (~2%). Unlike the HD5870/5850 cards that launched during Q4 2009 (pre-holiday season), the HD7970 launches into a slow Q1 2012, following a Q4 2011 that saw PC sales fall 6%:

http://www.techspot.com/n...lidays-apple-grew-18.html

So add the extremely high $550 price that attracts 2-3% of buyers, add the lack of sufficient retail stock of HD7970 cards and an HD7950 delayed by more than a month, and the head start this time is even less of an advantage than it was last time, because the HD5850/5870 were priced very aggressively and launched in a quarter where a lot of gamers bought hardware for Xmas/etc.

3. Launching first is effective only IF you are able to deliver an entire new product line, including lower-priced cards in the $99-$349 range. This is because 85% of desktop discrete GPUs are sold in the < $349 price bracket. Yet, so far AMD has only launched 1 high-end HD7000 series card on the desktop. It will take them another 1-2 months to get their entire product line out. In the meantime, NV still has a very competitive product line-up under $349. Only those who upgrade from a GTX580 to a 7970 (i.e., drop $500+ on GPUs) are excited about 7970/GTX680-style cards.

What we need are MUCH faster mid-to-high-end cards in the HD6950/GTX560Ti class, and we might have to wait until 2013 before that happens.

stufz (godly)
Joined: Oct 02, 2011 | Posts: 1734 | Location: sasnakia
Posted: Mon Feb 27, 2012 1:08 am

aight ! this is more understandable >..

Taiwanese AIB partners have revealed to Sweclockers that Nvidia would be launching its 28nm Kepler GK104-based GeForce GTX 670Ti GPU in March.


Rumors of a Kepler GPU based on the GK104 core were already circulating the web. The GeForce GTX670Ti, the successor to the GF114-based GeForce GTX560Ti, would be Nvidia's performance-tier weapon against AMD's recently released HD7000 series cards, with performance levels better than the GeForce GTX 580 and Radeon HD 7950.

A few specifications of the GK104 were also leaked earlier, showing that the chip would feature 1536 CUDA cores, 128 TMUs, 32 ROPs and 2 GB of 256-bit GDDR5 memory. The core clock would be maintained around 950MHz-1050MHz, the TDP ranges around 225W, and the retail price is expected around $299. Another GK104 product, codenamed GTX680, was unveiled a while back and is expected to be faster than the HD7970 GPU. Details here.
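If those leaked numbers hold, peak single-precision throughput is easy to ballpark: cores x 2 FLOPs per cycle (one fused multiply-add) x clock. A minimal Python sketch of that arithmetic, using the leaked figures above:

Code:
# Ballpark peak FP32 throughput from the leaked GK104 figures above.
# Assumes 2 FLOPs per CUDA core per cycle (one fused multiply-add).
CUDA_CORES = 1536

for clock_mhz in (950, 1050):
    tflops = CUDA_CORES * 2 * clock_mhz * 1e6 / 1e12
    print(f"{clock_mhz} MHz -> {tflops:.2f} TFLOPS peak FP32")
# 950 MHz -> 2.92 TFLOPS; 1050 MHz -> 3.23 TFLOPS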

On the other hand, the GK110, the flagship chip that replaces the GF110-based GTX580, would arrive in Q3/Q4 2012, while lower-end models based on the GK107 chip are also expected in March-April 2012. You can get further details on Nvidia Kepler GPUs below:

NVIDIA Expected to launch Eight New 28nm Kepler GPUs in April 2012

Alleged Nvidia Kepler GK104 Specs Exposed – GPU to Feature 1536 Cuda Cores, No Hot-Clocks and comes in Two Variants

NVIDIA Kepler GK104 Gaming Performance Figures Exposed, Faster than GTX580 and HD7900 Series

NVIDIA Kepler GK104 GeForce GTX680 Architecture Pictured – Specifications Detailed

SOURCE

stufz (godly)
Joined: Oct 02, 2011 | Posts: 1734 | Location: sasnakia
Posted: Thu Mar 22, 2012 9:28 am

It's on the street today!

PRICE - $499.99

EVGA 02G-P4-2680-KR GeForce GTX 680 2GB 256-bit GDDR5 PCI Express 3.0 x16 HDCP Ready SLI Support Video Card

Chipset
----------
Core Clock - 1006MHz
Stream Processors - 1536 Processor Cores


Memory
----------
Effective Memory Clock - 6000MHz
Memory Size - 2GB
Memory Interface - 256-bit
Memory Type - GDDR5

Power
----------
Min. 550 Watt PSU with a +12 Volt current rating of 38 Amps
Power Connectors - 2 x 6 Pin

Card Size - 10" x 4.38"
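Those memory specs pin down the card's peak memory bandwidth. As a quick sanity check, a minimal Python sketch (the standard formula, not a figure from EVGA):

Code:
# Peak memory bandwidth = effective clock (transfers/s) x bus width (bytes).
effective_clock_hz = 6000e6   # 6000MHz effective GDDR5, from the specs above
bus_width_bytes = 256 // 8    # 256-bit memory interface

print(f"{effective_clock_hz * bus_width_bytes / 1e9:.0f} GB/s")  # 192 GB/s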

SOURCE

Evil_Elf (godly)
Joined: Nov 08, 2011 | Posts: 319
Posted: Thu Mar 22, 2012 11:00 am

$500 is not bad.. I thought they would be more in the $700 range


check out the EVGA cards
The EVGA GeForce GTX 680 comes stock ready to water cool

http://wccftech.com/evga-announces-evga-geforce-gtx-680-evga-precision/

stufz (godly)
Joined: Oct 02, 2011 | Posts: 1734 | Location: sasnakia
Posted: Fri Mar 23, 2012 1:47 pm

Review is up at Tom's Hardware

http://www.tomshardware.com/reviews/geforce-gtx-680-review-benchmark,3161.html


Nvidia is fond of drawing parallels. With its prior-generation graphics cards, the company likened each model to a different role on a virtual battlefield. GeForce GTX 480 was the tank—big performance and, to a fault, a big price, big power, and big heat, as well. GeForce GTX 460 came to be referred to as the hunter, incorporating a better balance of speed, efficiency, and cost more apropos to gamers. Finally, GeForce GTS 450 was dubbed the sniper for its focus on enabling playable frame rates at 1680x1050, according to Nvidia.

As silly as that trio of categories seemed, they make it easier for us to put a finger on the pulse of GeForce GTX 680. Though its name (and price) suggests a successor to Nvidia’s current single-GPU flagship, this is decidedly the hunter—a gamer-oriented card that almost completely de-emphasizes the once-emphatic message of improving general-purpose compute performance. But hey, it does that whole gamer thing really well, just like the GeForce GTX 460.

AND


The rest of the dual-slot bracket plays host to four display outputs: two dual-link DVI connectors, one full-sized HDMI port, and a DisplayPort output. All of them are usable concurrently, addressing one of our sharpest critiques of all prior-gen Fermi-based boards. At long last, we can consider multi-monitor gaming tests to replace 2560x1600 in our single-card reviews (and indeed, multi-monitor benchmarks will follow in a story on which we're already working)! Like AMD, Nvidia claims that this card supports HDMI 1.4a, monitors with 4K horizontal resolutions, and multi-stream audio.

AND


Why quadruple the number of CUDA cores and double the other resources? Kepler’s shaders run at the processor’s frequency (1:1). Previous-generation architectures (everything since G80, that is) operated the shaders two times faster than the core (2:1). Thus, doubling shader throughput at a given clock rate requires four times as many cores running at half-speed.
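That arithmetic checks out. Here's the ratio as a minimal Python sketch; the GF114's 384 cores are my assumption for the comparison point, since the quadrupling is against the GTX 560 Ti-class chip:

Code:
# Shader throughput is proportional to cores x shader clock.
# Fermi ran shaders at 2x the core clock; Kepler runs them at 1x.
fermi_cores,  fermi_mult  = 384,  2.0   # GF114 / GTX 560 Ti (assumed)
kepler_cores, kepler_mult = 1536, 1.0   # GK104 / GTX 680

ratio = (kepler_cores * kepler_mult) / (fermi_cores * fermi_mult)
print(ratio)  # 2.0 -> 4x the cores at half the relative shader speed
              # doubles shader throughput at a given core clock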

The question then becomes: Why on earth would Nvidia throttle back its shader clock in the first place? It’s all about the delicate balance of performance, power, and die space, baby. Fermi allowed Nvidia’s architects to optimize for area. Fewer cores take up less space, after all. But running them twice as fast required much higher clock power. Kepler, on the other hand, is tuned for efficiency. Halving the shader clock slashes power consumption. However, comparable performance necessitates two times as many data paths. The result is that Kepler trades off die size for some reduction in power on the logic side, and more significant savings from clocking.

AND


The implementation of GPU Boost does not preclude overclocking. But because you can’t disable GPU Boost like you might with Intel’s Turbo Boost, you have to operate within the technology’s parameters.

For example, overclocking is now achieved through an offset. You can easily push the base 3D clock up 100, 150, or even 200 MHz. However, if a game was already TDP-constrained at the default clock, it won’t run any faster. In apps that weren’t hitting the GTX 680’s thermal limit before, the offset pushes the performance curve closer to the ceiling.
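In other words, the offset raises the requested clock, but the power ceiling still caps what you actually get. A toy Python model of that behavior (my paraphrase of the description above, not Nvidia's algorithm):

Code:
# Offset overclocking under GPU Boost: the offset raises the requested
# clock, but the TDP-limited ceiling still clamps the result.
def effective_clock(base_mhz, offset_mhz, power_limited_mhz):
    return min(base_mhz + offset_mhz, power_limited_mhz)

# A TDP-constrained game gains nothing from a +150 MHz offset...
print(effective_clock(1006, 150, power_limited_mhz=1006))  # 1006
# ...while a lighter workload rides the offset toward the ceiling.
print(effective_clock(1006, 150, power_limited_mhz=1300))  # 1156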

Because GPU Boost was designed to balance clock rate and voltage with thermal design power in mind, though, overclocking is really made most effective by adjusting the board’s power target upward as well. EVGA’s Precision X tweaking tool includes built-in sliders for both the power target and the GPU clock offset.

Although GeForce GTX 680’s TDP is 195 W, Nvidia says the card’s typical board power is closer to 170 W. So, increasing the power slider actually moves this number higher. At +32%, Precision X’s highest setting is designed to get you right up to the 225 W limit of what two six-pin power connectors and a PCI Express slot are specified to deliver.
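Those numbers are consistent with each other, as a quick check shows:

Code:
# GPU Boost power-target arithmetic from the figures quoted above.
typical_board_power_w = 170     # Nvidia's stated typical board power
max_power_target = 1.32         # Precision X's +32% ceiling

print(typical_board_power_w * max_power_target)  # 224.4 W -- right at the
# 225 W spec: 75 W per six-pin connector x2 plus 75 W from the PCIe slot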

Our attempt to push a 200 MHz offset demonstrates that, even though this technology tries to keep you at the highest frequency under a given power ceiling, increasing both limits still makes it easy to exceed the board’s potential and seize up.

Sliding back a bit to a 150 MHz offset gives us stability, but performance isn’t any better than the 100 MHz setting. No doubt, it’ll take more tinkering to find the right overclock with GPU Boost in the mix and always on.

AND


GeForce GTX 680 includes a 16-lane PCI Express interface, just like almost every other graphics card we’ve reviewed in the last seven or so years. However, it’s one of the first boards with third-gen support. All six Radeon HD 7000 family members preempt the GeForce GTX 680 in this regard. But we already know that, in today’s games, doubling the data rate of a bus that isn’t currently saturated doesn’t impact performance very much.

Nevertheless, PCI Express 3.0 support becomes a more important discussion point here because Nvidia's press driver doesn't enable it on X79-based platforms. The company's official stance is that the card is gen-three-capable, but that X79 Express is only validated for second-gen data rates. Drop it into an Ivy Bridge-based system, though, and it should immediately enable 8 GT/s transfer speeds.

Nvidia sent us an updated driver to prove that GeForce GTX 680 does work, and indeed, data transfer bandwidth shot up to almost 12 GB/s. Should Nvidia validate GTX 680 on X79, a new driver should be the answer. In contrast, the data bandwidth of AMD’s Radeon HD 7900s slides back from what we’ve seen in previous reviews. Neither AMD nor Gigabyte is able to explain why this is happening.
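For context, the theoretical one-way bandwidth of an x16 link per PCIe generation, from the standard signaling and encoding figures (not numbers from the review):

Code:
# Theoretical one-way PCIe x16 bandwidth per generation.
generations = {
    "PCIe 1.x": (2.5, 8 / 10),     # GT/s per lane, 8b/10b encoding
    "PCIe 2.0": (5.0, 8 / 10),
    "PCIe 3.0": (8.0, 128 / 130),  # 128b/130b encoding
}

for name, (gt_s, efficiency) in generations.items():
    print(f"{name} x16: {gt_s * efficiency * 16 / 8:.2f} GB/s")
# PCIe 1.x: 4.00 GB/s, PCIe 2.0: 8.00 GB/s, PCIe 3.0: 15.75 GB/s
# -- so the ~12 GB/s measured above is a believable real-world gen-3 rate.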

AND


Nvidia’s solution to the pitfalls of running with v-sync on or off is called adaptive v-sync. Basically, any time your card pushes more than 60 FPS, v-sync remains enabled. When the frame rate drops below that barrier, v-sync is turned off to prevent stuttering. The 300.99 driver provided with press boards enables adaptive V-sync through a drop-down menu that also contains settings for turning v-sync on or off.

Given limited time for testing, I was only really able to play a handful of games with and without v-sync, and then using adaptive v-sync. The tearing effect with v-sync turned off is the most distracting artifact. I’m less bothered when v-sync is on. Though, to be honest, it takes a title like Crysis 2 at Ultra quality to bounce above and below 60 FPS with any regularity on a GeForce GTX 680.

stufz (godly)
Joined: Oct 02, 2011 | Posts: 1734 | Location: sasnakia
Posted: Fri Mar 23, 2012 3:08 pm

03.22.12

EVGA today announced the launch of its new GeForce GTX 680 lineup consisting of five models, including the EVGA SuperClocked, EVGA FTW, EVGA Classified and EVGA Hydro Copper GTX 680 variants.

SOURCE

8+6 Pin/8+8 Pin Power Connectors
Available on EVGA GeForce GTX 680 Superclocked Signature, FTW, Classified and Hydro Copper models, these connectors can provide up to 225/375 watts, delivering better overclock stability.

Read all about the GTX 680 line-up and their features HERE
