Lab Assistant
Original Poster
#1 Old 19th Sep 2019 at 1:56 AM

This user has the following games installed:

Sims 3, World Adventures, Ambitions, Late Night, Generations, Pets, Showtime, Supernatural, Seasons, University Life, Island Paradise, Into the Future
Does the game run on the integrated or on the switchable card?
Hi, while talking in another discussion, I started wondering which graphics card the game actually uses, so I went to look at the DeviceConfig file and found this:
=== Graphics device info ===
Number: 0
Name (driver): AMD Radeon(TM) R5 Graphics
Name (database): AMD Radeon HD 8500 Series [Found: 1, Matched: 1]
Vendor: ATI
Chipset: Vendor: 1002, Device: 98e4, Board: 13d01043, Chipset: 00c1
Driver: aticfx32.dll, Version: 26.20.11015.5009, GUID: D7B71EE2-DBA4-11CF-6260-D7187AC2D735

I still can't tell whether the game runs on the integrated card, the AMD Radeon(TM) R5 Graphics, or on the switchable one, the AMD Radeon HD 8500 Series...
Sorry for the trivial question, but I'm not familiar with these things...

I also read the dual-card forum guide at http://modthesims.info/t/481768, but it explains how to do it for switchable NVIDIA cards, and with a Radeon it's different...
Also, in the Radeon settings the game is listed under both the integrated and the switchable card, and I can't seem to enable it on the switchable one only.

I forgot: the Radeon settings list both The Sims 3 and Supernatural, which already seems strange to me; isn't my latest EP Island Paradise? Why Supernatural?
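For reference, the device block from DeviceConfig.log can be pulled out with a short script. This is just a sketch that assumes only the "Key: Value" layout visible in the excerpt above (the field names come from the excerpt, not from any official format specification), and note, as this thread goes on to discuss, that the card the log reports is not necessarily the card actually rendering the game:

```python
# Sketch: extract GPU fields from a Sims 3 DeviceConfig.log-style text.
# Assumes the "Key: Value" layout shown in the excerpt above.

def parse_graphics_info(text):
    """Return a dict of the fields in the '=== Graphics device info ===' block."""
    info = {}
    in_block = False
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("=== Graphics device info ==="):
            in_block = True
            continue
        if in_block:
            if not line:  # a blank line ends the block
                break
            key, _, value = line.partition(":")
            info[key.strip()] = value.strip()
    return info

sample = """=== Graphics device info ===
Number: 0
Name (driver): AMD Radeon(TM) R5 Graphics
Name (database): AMD Radeon HD 8500 Series [Found: 1, Matched: 1]
Vendor: ATI
"""

gpu = parse_graphics_info(sample)
print(gpu["Name (driver)"])  # the card the game's driver check reported
```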
Lab Assistant
Original Poster
#3 Old 19th Sep 2019 at 11:06 PM
Quote: Originally posted by nitromon
get GPU-Z

Sorry, I'm not very knowledgeable about this; what is it?
Lab Assistant
Original Poster
#5 Old 20th Sep 2019 at 5:11 PM
Quote: Originally posted by nitromon
Oh, it is a terrific little program that tells you everything you need to know about your GPU. It also monitors your GPU usage in real time. So run the small program while you play Sims 3. It lets you choose which GPU to monitor, so the GPU with "activities" is the one that is working and the one Sims 3 is running on.

The most important reason to use GPU-Z is that it monitors your GPU temperature, so you know if you are overheating.

https://www.techpowerup.com/download/techpowerup-gpu-z/

Thank you very much, I'll definitely try it tonight.
In fact, it does seem to overheat a little.
Scholar
#6 Old 20th Sep 2019 at 6:20 PM
btw, don't they have the same drivers?


favorite quote: "When ElaineNualla is posting..I always read..Nutella. I am sorry" by Rosebine
self-claimed "lower-spec simmer"
Lab Assistant
Original Poster
#7 Old 20th Sep 2019 at 9:51 PM
Quote: Originally posted by ElaineNualla
btw, don't they have the same drivers?


Hi, I'll have to check; I changed my PC recently and I'm not yet familiar with PCs that have two graphics cards.
Scholar
#8 Old 20th Sep 2019 at 10:39 PM
Just asking - I'm under the strange, unexplainable impression that these devices are basically the same (like the NVidia 250M and GT 840), just that the R5 is packed in with the CPU on a newer process and without its own memory, obviously. They both should (?) provide similar performance in a dual-channel RAM scenario, with superior power management on the R5 (obviously) and better internal communication with the CPU.

Or it's just that Area 51 influence? Damn...


Lab Assistant
Original Poster
#9 Old 21st Sep 2019 at 4:45 PM Last edited by Zamira : 21st Sep 2019 at 5:00 PM.
Quote: Originally posted by ElaineNualla
Just asking - I'm under the strange, unexplainable impression that these devices are basically the same (like the NVidia 250M and GT 840), just that the R5 is packed in with the CPU on a newer process and without its own memory, obviously. They both should (?) provide similar performance in a dual-channel RAM scenario, with superior power management on the R5 (obviously) and better internal communication with the CPU.

Or it's just that Area 51 influence? Damn...

Hi, I don't know; it seems to me that they are different. I'm attaching the GPU-Z data so you can tell me, since you're more experienced:
AMD Radeon(TM) R5 Graphics is the integrated card
AMD Radeon HD 8500M is the switchable one
Screenshots
Lab Assistant
Original Poster
#10 Old 21st Sep 2019 at 4:49 PM
Quote: Originally posted by nitromon
Oh, it is a terrific little program that tells you everything you need to know about your GPU. It also monitors your GPU usage in real time. So run the small program while you play Sims 3. It lets you choose which GPU to monitor, so the GPU with "activities" is the one that is working and the one Sims 3 is running on.

The most important reason to use GPU-Z is that it monitors your GPU temperature, so you know if you are overheating.

https://www.techpowerup.com/download/techpowerup-gpu-z/

I tried the program, and since the integrated GPU is always at 30 degrees, I checked with Task Manager and discovered that even when doing nothing the PC has many processes in the background (55), and 97 Windows processes seems like too many to me...
I have already reduced them by disabling useless things.
Lab Assistant
Original Poster
#12 Old 22nd Sep 2019 at 5:27 PM
Quote: Originally posted by nitromon
Are you playing in window mode or full screen?

If you are playing in window mode, you can see the activity in real time. If you are playing in full screen and need to alt-tab out to see the sensor, then you can look at the history. Check both GPUs' sensor data for temperature, load, etc...

They both should show activity (I'm assuming this is a laptop?), just one of them a lot more. I assume AMD uses the same scheme as Nvidia's Optimus, where the dedicated GPU renders through the integrated one.

So for example, on my GPUs right now, running TS3 in window mode:

Nvidia:
Core clock - 835 MHz
Memory clock - 1000.4 MHz
GPU Temp - 72°C
GPU Load - 47%
Memory - 589 MB

Intel HD:
Core clock - 350 MHz (so you can see it is at its minimum; when gaming on the Intel HD, the clock goes to 1250 MHz)
Memory clock - 800 MHz
GPU Temp - 74°C (since it is integrated, it runs with the CPU)
GPU Load - 21% (the integrated is always running, but again, when gaming on this Intel HD it would be at nearly 99%)
Memory - 245 MB (a lot less than when gaming)

So you see both have activity, but one far less.

If your GPU is at 30°C, it means it is idling (I'm assuming this is a laptop?). It also means this is probably your dedicated card and you are not running it. To tell you the truth, looking at your 2 screenshots, I can't tell which one is the dedicated.

The 8500M seems to have better pixel and shader rates, a faster clock, and a larger memory pool, but it runs on DDR3, which is older;

the R5 runs on DDR4 (newer), with better bandwidth and bus width.

Hi, it's a laptop I use;
usually I play full screen, I'll try window mode;
yesterday, while the game was running, I left full-screen mode, and with GPU-Z and another program I checked the graphics cards:
from the other program, it seems that both are working;
while GPU-Z gave me a different result, showing only one card, at 35 degrees;
I'm attaching a screenshot from the other program:
tonight I will try to do as you said and then let you know, thanks for the assistance
I forgot: what is CrossFire?
Screenshots
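nitromon's comparison above (47% load on the dedicated card versus 21% on the integrated one) boils down to a simple rule of thumb: sample both GPUs' load while the game runs and pick the one doing most of the work. A minimal sketch; the GPU names and the "clearly busier" margin are illustrative assumptions, and real readings would come from GPU-Z's sensor tab:

```python
# Sketch: decide which GPU a game is rendering on from load samples.
# The sample values mirror nitromon's window-mode example (47% vs 21%);
# the names and the 15-point margin are illustrative assumptions.

def active_gpu(loads, margin=15):
    """loads: dict mapping GPU name -> load percent.
    Returns the busiest GPU, or None if no GPU leads the next one
    by at least `margin` points (e.g. both near idle)."""
    ranked = sorted(loads.items(), key=lambda kv: kv[1], reverse=True)
    (top_name, top_load), (_, next_load) = ranked[0], ranked[1]
    if top_load - next_load >= margin:
        return top_name
    return None  # readings too close to call

samples = {"NVIDIA dedicated": 47, "Intel HD integrated": 21}
print(active_gpu(samples))  # -> NVIDIA dedicated
```

Remember that on an Optimus-style setup the integrated GPU always shows some load, since the dedicated card's frames pass through it; that is why a margin is used instead of just "greater than zero".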
Scholar
#13 Old 22nd Sep 2019 at 5:55 PM
yeah, they use the same driver, as I thought (and the difference between them is negligible in real-life scenarios; yes, the 8500 has a bigger unit count and texture fill rate, not so important nowadays, but memory access, which is crucial, is seriously superior on the R5). They both should perform at a similar rate in the scenarios where their performance matters (these are low-to-mid-end GPUs anyway, by today's standards).

You should have an AMD tool installed which, like its NVidia counterpart, should give you the ability to assign a particular GPU to a program. Probably.
Have you tried something like this: [ https://www.amd.com/en/support/kb/faq/dh-017 ]? In W10 these settings are buried somewhere inside the sub-mess of the "display" settings (maybe try running the old W7 app: just type "settings" (or the same word in your language) in the "run" section of the menu; this should open the much more reasonable old W7/8 panel).
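On Windows 10 (version 1803 and later) the same per-app choice can also be made by Windows itself under Settings > System > Display > Graphics settings, which stores the preference in the registry. A hypothetical .reg fragment as a sketch only: the install path below is an assumption (use the real path to your own TS3.exe), and `GpuPreference=2` means "high performance" while `1` means "power saving"; note that older switchable-graphics drivers may not honor this setting:

```
Windows Registry Editor Version 5.00

; Hypothetical example: pin one app to the high-performance GPU via
; Windows 10's per-app preference. The exe path is an assumed default
; install location; replace it with your actual TS3.exe path.
[HKEY_CURRENT_USER\Software\Microsoft\DirectX\UserGpuPreferences]
"C:\\Program Files (x86)\\Electronic Arts\\The Sims 3\\Game\\Bin\\TS3.exe"="GpuPreference=2;"
```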


Lab Assistant
Original Poster
#14 Old 22nd Sep 2019 at 11:33 PM
Quote: Originally posted by ElaineNualla
yeah, they use the same driver, as I thought (and the difference between them is negligible in real-life scenarios; yes, the 8500 has a bigger unit count and texture fill rate, not so important nowadays, but memory access, which is crucial, is seriously superior on the R5). They both should perform at a similar rate in the scenarios where their performance matters (these are low-to-mid-end GPUs anyway, by today's standards).

You should have an AMD tool installed which, like its NVidia counterpart, should give you the ability to assign a particular GPU to a program. Probably.
Have you tried something like this: [ https://www.amd.com/en/support/kb/faq/dh-017 ]? In W10 these settings are buried somewhere inside the sub-mess of the "display" settings (maybe try running the old W7 app: just type "settings" (or the same word in your language) in the "run" section of the menu; this should open the much more reasonable old W7/8 panel).

Hi, thanks for the info, I hope the GPUs are at least good enough for playing The Sims...
From the AMD panel, I tried to assign the HD 8500 card to The Sims, and now it shows that The Sims uses it;
I'm using GPU-Z and the other program to verify that the game actually uses that card, and that it doesn't overheat too much and get damaged.