77 Comments

  • Purri - Monday, March 8, 2010 - link

    OK, so I read a lot of comments saying the cheap passive DP adapters won't work for an Eyefinity three-monitor setup.

    But can I use this card for a three-monitor Windows desktop setup without Eyefinity, or do I need an expensive adapter for that too?

    I'm looking for a cheapish, passively (i.e. silently) cooled card that supports three monitors for Windows applications, with enough performance to play a few old games now and then (like Quake 3) on one monitor.

    Will this card work?
  • waqarshigri - Wednesday, December 4, 2013 - link

    Yes, of course - it has AMD Eyefinity technology. I played new games on it like NFS: The Run, Call of Duty: MW3, and Battlefield 3.
  • plopke - Friday, February 5, 2010 - link

    :o What about the 5830? Wasn't it delayed until the 5th? All the tech sites have suddenly gone very quiet about it, and it didn't launch today.
  • yyrkoon - Thursday, February 4, 2010 - link

    Your charts are all buggered up. Just looking over them: in Crysis: Warhead, you test the NVIDIA 9600GT for performance. OK, fine. Then we move along to the power consumption charts, and you swap the 9600GT for the 9500GT? Better still, we move to both heat tests, and both of these cards are omitted.

    WTH?! Come on guys, is a bit of consistency too much to ask?
  • Ryan Smith - Friday, February 5, 2010 - link

    Some of those cards are from Anand's personal collection, and I don't have a matching card. We have near-identical hardware that produces the same performance numbers; however, we can't replicate the power/noise/temperature data due to differences in cases and environment.

    So I can put his cards in our performance tests, but I can't use his cards for power/temp/noise testing. It's not perfect, but it allows us to bring you the most data we can.
  • yyrkoon - Friday, February 5, 2010 - link

    Well, my only real gripe here is that I actually own a 9600GT. Since we moved last year and are completely off-grid (solar/wind), I would have liked to compare power consumption between the two without having to buy something to find out.

    Oh well, nothing can be done about it now, I suppose.

    I can say, however, that a 9600GT in a P35 system with a Core 2 E6550, 4GB of RAM, and 4 Seagate Barracudas uses ~167-168W idle. While gaming, the most CPU/GPU-intensive games for me were World in Conflict and Hellgate: London; those two "sucked down" 220-227W at the wall. The system was also moderately overclocked to get the memory and "FSB" to 1:1. These numbers are pretty close but not super accurate - as close as I could come eyeballing a Kill A Watt while trying to generate some load. The power supply was an 80 Plus 500W unit, manufactured by Seasonic if anyone must know (an Antec EarthWatts 500).
  • yyrkoon - Friday, February 5, 2010 - link

    Ah, I forgot: the numbers I gave for the "complete" system at the wall include powering a 19" widescreen LCD that consistently uses 23W.
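
    For anyone who wants the PC-only numbers, here's the quick arithmetic (a rough sketch in Python; I'm taking 167.5W as the midpoint of my reported idle range):

        monitor_w = 23.0
        idle_wall = 167.5                      # midpoint of the ~167-168W reading
        game_wall_lo, game_wall_hi = 220.0, 227.0

        pc_idle = idle_wall - monitor_w        # PC alone at idle
        pc_game_lo = game_wall_lo - monitor_w  # PC alone while gaming (low end)
        pc_game_hi = game_wall_hi - monitor_w  # PC alone while gaming (high end)
        delta_lo = game_wall_lo - idle_wall    # extra CPU+GPU load power at the
        delta_hi = game_wall_hi - idle_wall    # wall, so PSU losses are included

        print(f"PC-only idle: {pc_idle:.1f}W, gaming: {pc_game_lo:.0f}-{pc_game_hi:.0f}W")
        print(f"Idle-to-gaming delta: {delta_lo:.1f}-{delta_hi:.1f}W")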
  • dagamer34 - Thursday, February 4, 2010 - link

    Where's the low-profile 5650?? I don't want to downgrade my 4650 to a 5450 just for HD bitstreaming. =/
  • Roy2001 - Thursday, February 4, 2010 - link

    Video gaming is on the Xbox 360 and Wii for me, so the i3-530 for $117 is a better solution. It supports bitstreaming through HDMI too. My 2 cents.
  • Taft12 - Thursday, February 4, 2010 - link

    I apologize if this has been confirmed already, but does this mean we won't see a chip from ATI that falls between 5450 and 5670?

    There were four GPUs in this range last gen (4350, 4550, 4650, 4670).
  • Taft12 - Thursday, February 4, 2010 - link

    Replying to my own post to say I reread the 5670 article, and indeed the 5500 series is mentioned there. I am VERY sure this is what Ryan was referring to earlier when he said "wait a week".
  • juampavalverde - Thursday, February 4, 2010 - link

    I don't see the improvement over the last-generation 4550. DX11 is useless in cards this slow (it's almost useless even on a 5670!). I expect this level of performance from a next-gen IGP, not a discrete chip. AMD should have raised the performance bar for this generation, releasing something like a 5450 with 120 SPs, a 5670 with 640 SPs (like the 4770), a 5770 with 960 SPs, and 80 SPs for the IGP.
  • Taft12 - Thursday, February 4, 2010 - link

    The improvement is purely in power consumption. They can't improve performance of these lowest-end discrete cards or IGPs too much or they will eat into the value of the next step up (4670, 5670). You may "expect this level of performance" but you aren't gonna get it.
  • Rick83 - Thursday, February 4, 2010 - link

    I'd be rather interested in seeing a differential analysis of power consumption between the two ATI cards (5450/5670) when rendering an H.264-encoded 1080p movie with DTS out via the card, as well as Blu-ray with HD audio.
    And a video benchmark, to show how much bitrate/fps the respective cards manage before desync/framedrop/freeze.
    Power during FurMark (or gaming) is of course higher on the 5670, because it has five times as many shaders to feed - but depending on how smart ATI's power management is, the two cards might not differ much when used in an HTPC.
    And frankly, the 50 dollars extra would probably be worth the extra rendering/decoding horsepower, especially in an HTPC, where you want buttery-smooth performance without worrying about bitrate.
    Oh - any news on passive 5670s? If they can do 5750s, I'm sure there'll be a few 5670s someday?
  • Ryan Smith - Thursday, February 4, 2010 - link

    This isn't something we tested, since it really isn't an issue. The Cheese Slices test peaks at 35 Mbps MPEG-2, and I have an H.264 version that peaks at a similar bitrate.

    The 5450 has enough power for anything up to 1080p Blu-ray.
  • Rick83 - Thursday, February 4, 2010 - link

    Well, considering the extra shader load added by filters, apparently it may not be - otherwise the proper deinterlacing algorithm would have been available.

    And that still leaves the question of the power draw of a 5670 at 5450 levels of load. I'm pretty sure that in an HTPC, unless you use it as a console replacement for gaming, there will never be a situation where the GPU is fully loaded, hence power draw should be lower than the full-tilt number you published.
  • dagamer34 - Thursday, February 4, 2010 - link

    Most HTPCs are meant to be small, and there's no way you're going to be able to fit a 5700 series card into a low-profile space. I know they had 4650s last generation, but there aren't any 5650s yet. =/
  • Redstorm - Thursday, February 4, 2010 - link

    I just can't see how reviewers are claiming this card is the perfect HTPC video card. Not everyone uses Microsoft's Media Center. Lack of VDPAU support on Linux is a glaring hole in all ATI cards. If I were building a gaming PC today I would probably buy an ATI 5870, but if you're building a low-power HTPC you can't go past NVIDIA and VDPAU support. Their ION platform will do 1080p in hardware on Linux for under 30 watts; once you factor in motherboard and CPU, that beats this ATI offering hands down.

    Best HTPC card? I think not...
  • CiNcH - Thursday, February 4, 2010 - link

    XvBA is also said to be working; it acts as a backend to VA-API, just like VDPAU. They are even examining the legal situation and whether the UVD specs can be opened up for support within the OSS (xf86-video-ati) driver. Of course, nothing that will be done by tomorrow.

    What I think would be worth mentioning when comparing NVIDIA and ATI for HTPC use is that UVD won't play H.264 higher than level 4.1, whereas NVIDIA's PureVideo is capable of decoding up to level 5.1.
  • milli - Thursday, February 4, 2010 - link

    The Evergreen series does apply angle-independent anisotropic filtering. Also, the fixed-function interpolators have been removed, with attribute interpolation moved onto the shaders.
    Considering the limited shader power of the HD 5450, this causes a bigger performance drop than on the other Evergreen products. (A rough sketch of that per-pixel cost follows below.)
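
    As a rough illustration (not ATI's actual implementation), the per-pixel work that lands on the shader ALUs amounts to a barycentric blend of the three vertex attributes - a couple of multiply-adds per attribute per pixel, trivial on a big chip but proportionally heavier on an 80-SP part:

        # Schematic of per-pixel attribute interpolation done in-shader on
        # Evergreen: a pixel's attribute value is a barycentric blend of the
        # three vertex values (perspective correction omitted for brevity).
        def interpolate(a0, a1, a2, b0, b1, b2):
            # (b0, b1, b2) are the pixel's barycentric weights; they sum to 1
            return b0 * a0 + b1 * a1 + b2 * a2

        # e.g. a texture coordinate at the centroid of a triangle:
        print(interpolate(0.0, 1.0, 0.5, 1/3, 1/3, 1/3))  # -> 0.5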
  • uibo - Thursday, February 4, 2010 - link

    Why don't the Radeon "Cheese Slices" video screenshots have horizontal lines? The Nvidia ones have them...
  • Ryan Smith - Thursday, February 4, 2010 - link

    It's an artifact of stepping through the video one frame at a time with MPC-HC with the MS MPEG-2 decoder. Doing so captures the angled lines correctly, but it doesn't quite capture some of the other artifacts exactly the same because it ends up a field (basically half a frame) ahead.

    In motion the Radeon cards are getting it right.
  • blaubart - Tuesday, February 9, 2010 - link

    Congrats Ryan, your Cheese Slices testing outside of an HTPC forum was really a big surprise for me! Keep pushing AMD/NVIDIA to realize that there's more than gamers in this world!

    > Horizontal 1p lines missing:
    I'm also wondering about that; normally the 1p lines are no problem in screenshots (MPC-HC and others). Maybe you shot them during the "odd movement" (1h+3v, see the description in the Cheese Slices thread). I have now edited Post #1: screenshots only during the "even movement".

    This link shows it (sorry, German):
    http://www.dvbviewer.info/forum/index.php?s=&s...

    1080i-1 ---> odd movement
    1080i-2 ---> even movement

    What's more, GT220.png and G210.png show MA in the 1p and "response - noise" tests but VA in the ticker. How did you manage this? Ah, I see you left a mail address; I will send you a mail.
  • uibo - Thursday, February 4, 2010 - link

    Oh, and the NVIDIA G210 has some artifacts in the vertical lines, with some pixels shifted right.
  • silverblue - Thursday, February 4, 2010 - link

    ...then this might call for an article to look at it in that very light. However, is there a higher performing part for both series that can be directly compared? I'm guessing not, as they all differ in some way, be it shader numbers, ROPs or texture units. The only way I can think to do it would be comparing two 512MB 4850s to a downclocked 5870, but even if that were possible, you'll still get a performance drop due to Crossfire. Hmm.
  • MrSpadge - Thursday, February 4, 2010 - link

    The 5770 is as close as it gets. There's the difference in the memory subsystem, though. Could be that the best bench to run would be ShaderMark.
  • GeorgeH - Thursday, February 4, 2010 - link

    You mention the Sapphire's heatsink quite a bit, but I didn't see any pictures of it (at least the reverse side) in the article; was that an oversight or am I blind?
  • Ryan Smith - Thursday, February 4, 2010 - link

    Oversight. I thought I had a stock photo of the rear. I'll get one in the morning.

    If you're really curious, it's the same heatsink that's on their 4350, which there are plenty of pictures of.
  • GeorgeH - Thursday, February 4, 2010 - link

    I see it now, thanks - it's actually not nearly as "bad" as I thought it'd be.
  • Lifted - Thursday, February 4, 2010 - link

    The first graph on each of the benchmark pages lists a 5670, the second graph lists a 4670. Typo or are you actually using different cards?
  • Ryan Smith - Thursday, February 4, 2010 - link

    It's not a typo. We never ran the 5670 at 1024x768; there was no reason to, since it's more than fast enough at 1280 and above.

    The 4670 data is from the GT 240 review, for which we did use 1024 (because the GT 240 couldn't always cut the mustard above 1024).
  • 8steve8 - Thursday, February 4, 2010 - link

    Should have had the Clarkdale IGP in there for good measure. If you aren't gaming, I'd guess the IGP would be the way to go.
  • MrSpadge - Thursday, February 4, 2010 - link

    Would have been interesting to compare idle power consumption: Clarkie + IGP vs. Clarkie + 5450.
  • Ryan Smith - Thursday, February 4, 2010 - link

    Testing a Clarkie requires switching out our test rig, so the results wouldn't be directly comparable, since it means switching out everything including the processor. Plus, Anand and I only have a couple of Clarkies, which are currently in use for other projects.

    At this point Clarkie (and any other IGP) is still less than half as fast as the 5450.
  • strikeback03 - Thursday, February 4, 2010 - link

    That brings up the point though that with a card this low on the totem pole it might be nice to include a benchmark or two of it paired with similarly low-priced hardware. I understand the reason for generally using the same testbed, but when it is already borderline playable it would be nice to know that it won't get any slower when actually paired with a cheap processor and motherboard.
  • Ryan Smith - Thursday, February 4, 2010 - link

    Testing a Clarkie requires switching out our test rig, so the results wouldn't be directly comparable, since it means switching out everything including the processor.

    At this point Clarkie (and any other IGP) is still less than half as fast as the 5450.
  • kevinqian - Thursday, February 4, 2010 - link

    Hey Ryan, I'm glad you are the first reviewer to utilize Blaubart's very helpful deinterlacing benchmark. I would just like to note that with ATI, memory bandwidth seems to play a big part in the deinterlacing method as well. For example, the HD 4650 DDR2 can only perform MA deinterlacing, even though it has the same shaders as the (VA-capable) 4670; the only bottleneck there seems to be the DDR2 memory bandwidth. On the other hand, the HD 4550 has DDR3 but is limited to a 64-bit memory interface, so that seems to be the limiting factor.

    I have an old HD 2600 Pro DDR2 AGP. When I OC the memory from 800MHz stock to 1000MHz, VA gets activated in CCC and confirmed in Cheese Slices.

    NVIDIA's deinterlacing algorithms seem to be less memory-intensive, as even the GT220 with DDR2 is able to perform VA-like deinterlacing. (Rough bandwidth numbers below.)
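
    For anyone who wants to sanity-check the bandwidth theory, here's a rough comparison; the transfer rates and bus widths are commonly quoted reference specs (assumptions, not measurements), and I'm reading my 2600 Pro's clocks as effective rates:

        # bandwidth (GB/s) = effective transfer rate (MT/s) x bus width in bytes
        def bandwidth_gbps(mtps, bus_bits):
            return mtps * (bus_bits / 8) / 1000

        cards = [
            ("HD 4650 DDR2,  128-bit,  ~800 MT/s (MA only)", 800, 128),
            ("HD 4670 GDDR3, 128-bit, ~2000 MT/s (VA)", 2000, 128),
            ("HD 4550 DDR3,   64-bit, ~1600 MT/s (MA only)", 1600, 64),
            ("HD 2600 Pro,   128-bit,   800 MT/s stock (MA)", 800, 128),
            ("HD 2600 Pro,   128-bit,  1000 MT/s OC'd (VA)", 1000, 128),
        ]
        for name, rate, bus in cards:
            print(f"{name}: {bandwidth_gbps(rate, bus):.1f} GB/s")

    The MA-only cards cluster around 12.8 GB/s while the VA-capable ones sit at 16 GB/s or more, which fits the idea of a bandwidth threshold.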
  • Ryan Smith - Thursday, February 4, 2010 - link

    Yeah, I've seen the bandwidth idea thrown around. Unfortunately I don't have any additional suitable low-end cards for testing it.
  • ET - Thursday, February 4, 2010 - link

    I think I remember reading that the interpolation of input values in the pixel shader was moved from fixed function units to being done by the shaders.
  • Ryan Smith - Thursday, February 4, 2010 - link

    That is one of the things that changed. However it's not very resource intensive for the shaders (which is one of the reasons why it was moved in the first place) and I don't seriously suspect that's the cause.
  • andy o - Thursday, February 4, 2010 - link

    So far all the 5000 series cards have an issue with PowerPlay. It messes up audio, especially Dolby Digital and DTS tracks, when DXVA is disabled. When DXVA is enabled the clocks are stabilized (at 400 MHz GPU and 900 MHz memory for the 5770), so PowerPlay doesn't screw with the audio. Without DXVA, the clocks are all over the place (PowerPlay enabled, normally a good thing), and this causes audio dropouts with DD and DTS tracks.

    Could you test this with HD videos and DD/DTS tracks? Maybe you'll have a better chance of getting this fixed than a bunch of us just dealing with their horrible support.

    PowerPlay also triggers funky sound when HDMI is used and 5.1 or 7.1 24-bit 96 kHz or 192 kHz output is set in Windows. Just set it like that, go about your business, and you'll hear either crackling or crazy channel switching.

    See here for reference, and the following posts from other users who confirm it and even come up with their own ways to disable PowerPlay. This thread at Doom9 was where it was discovered, and it was later confirmed by nearly everyone who tried (except one or two strange cases).
  • Ryan Smith - Saturday, February 6, 2010 - link

    Andy, shoot me an email. I was going to email you, but I'm not sure the address for you in our system actually goes to an account that gets read.
  • andy o - Monday, February 8, 2010 - link

    just sent you the email, thanks.
  • PR3ACH3R - Friday, February 5, 2010 - link

    [Quote]
    So far all the 5000 series cards have an issue with PowerPlay. It messes up audio, especially Dolby Digital and DTS tracks, when DXVA is disabled. When DXVA is enabled the clocks are stabilized (at 400 MHz GPU and 900 MHz memory for the 5770), so PowerPlay doesn't screw with the audio. Without DXVA, the clocks are all over the place (PowerPlay enabled, normally a good thing), and this causes audio dropouts with DD and DTS tracks.
    [/quote]

    1. ATI & the Review Sites, including Anandtech,
    have been ignoring this horrible problem in these cards,
    which makes them , for all practical purposes - useless.

    But - It does not stop there.

    2. The problem you have mentioned with The 57xx series,
    creates SERIOUS DPC latencies , especially in XP,
    that brakes even the fastest systems, & All audio is full of glitches & clicks chaos.

    3. To add insult to injury 2D performance is the worst EVER seen on the PC, beaten even By IGPs.


    Bottom Line:
    Anadtech yet again fails to detect & report to these issues,
    So I would not expect any replies to your questions here.

    These cards spell Recall / Class Action all over them.
  • andyo - Friday, February 5, 2010 - link

    Also, I'm not sure what you mean with number 3 (2D performance issues). Could you give some examples so I can test it?
  • PR3ACH3R - Saturday, February 6, 2010 - link

    If you do any sort of regular graphics (ignore the "if"...),
    it's all 2D.

    After looking hopelessly for ANY solution to, or even recognition of, this problem from professionals so I could share it, I found there is only one site with staff professional and unbiased enough to note the problem.

    Not only did they notice it, they published 2 giant articles about it, and it is painfully, obviously, certainly NOT AnandTech.

    http://translate.googleusercontent.com/translate_c...

    http://translate.googleusercontent.com/translate_c...
  • andyo - Friday, February 5, 2010 - link

    It is a big issue, but I'm not sure it's a hardware problem. It's probably a driver thing.

    In any case, you can disable PowerPlay for the time being. I've done what I linked above and it's working acceptably for me; when I play a game, I'll switch profiles to enable the higher clocks. I'm not sure whether the procedure is the same on XP, but you can also use GPU Clock Tool to stabilize the clocks.
  • Taft12 - Thursday, February 4, 2010 - link

    I have a hard time seeing AMD giving this much attention; the number of users concerned with this issue is infinitesimal.

    Why overclock your GPU when you are focused on the audiophile features? I'll be shocked if the official response is anything other than "graphics card audio output is not supported with PowerPlay enabled".
  • andy o - Thursday, February 4, 2010 - link

    Oh, and BTW, as the poster above said, it's not an "audiophile" issue. I actually try to distance myself from that term as much as possible. It happens whenever DXVA is not enabled, with DD and DTS audio - as in when playing DVDs with (say) PowerDVD and its post-processing features. Pretty normal scenarios. And it's not a subtle thing; it's dropouts (pops or cut-outs in the audio). Also choppy Flash video (though that shouldn't happen with DXVA-accelerated video in Flash 10.1).

    PowerPlay also triggers horrible crackling and channel switching when output is set to multichannel (5.1 or 7.1) 96 kHz or 192 kHz audio in the Windows mixer. Hardly audiophile issues, any of these.
  • andy o - Thursday, February 4, 2010 - link

    [I got an error, so sorry if this is posted twice.]

    It's not overclocking at all. PowerPlay is, as the poster above said, for power efficiency only. It doesn't overclock; it underclocks when the GPU is not being stressed.

    If you're referring to one of the posts that requires you to enable Overdrive, notice that it's only enabled so you can stabilize the clock (thus effectively disabling PowerPlay); the GPU/memory are actually being underclocked by editing an XML file and lowering the clocks manually via Overdrive.
  • ATWindsor - Thursday, February 4, 2010 - link

    First of all, it's not really an "audiophile feature" to get audio without dropouts and other problems over HDMI; dropouts are devastating to the audio whether you are an audiophile or not. Secondly, PowerPlay is also used for power efficiency. The result is that HDMI audio doesn't work at default settings for many people, which is a pretty major issue.

    AtW
  • andy o - Thursday, February 4, 2010 - link

    OK, so hyperlinks aren't working.

    This is the first thread I linked.
    http://www.avsforum.com/avs-vb/showthread.php?p=17...

    this is the doom9 thread.
    http://forum.doom9.org/showthread.php?p=1359418#po...

    ATI is giving some users the runaround.
  • ereavis - Thursday, February 4, 2010 - link

    try this ATI hotfix
    http://support.amd.com/us/kbarticles/Pages/ATICata...
  • andy o - Thursday, February 4, 2010 - link

    Already did; it's the same with 9.11, 9.12, the 9.12 hotfix, 10.1, and version 8.70 RC2 (presumably 10.2 RC2).
  • toyota - Thursday, February 4, 2010 - link

    Am I missing something? Why are you saying the Far Cry 2 benchmark can't go lower than High settings? All you have to do is select DX9, and you can choose Low or Medium from there.
  • Ryan Smith - Thursday, February 4, 2010 - link

    We stick to DX10 mode for benchmarking DX10-enabled games. In fact I never even tried DX9, otherwise I would have noticed that it goes lower.

    Humm...
  • toyota - Thursday, February 4, 2010 - link

    Well, anybody trying to game on this thing will have to use whatever realistically playable settings are available. That means DX9 would need to be used for Crysis/Warhead and Far Cry 2.
  • andy o - Thursday, February 4, 2010 - link

    That option has been there for a while, but there's no info on what exactly it does.
  • Ryan Smith - Thursday, February 4, 2010 - link

    Frankly, AMD has never documented it well. It has something to do with working with Windows MMCSS and DXVA to do exactly what the name describes, and that's all I know off-hand.

    It's aptly named though; I've seen a number of examples where enabling it does what's on the label.
  • andy o - Thursday, February 4, 2010 - link

    So far the most reasonable explanation I've found by googling is someone in a forum suggesting that its function is just to disable certain features so as to prioritize smooth playback over them. Otherwise, I don't see any difference with the 5770 (with that card it doesn't disable anything).
  • UNCjigga - Thursday, February 4, 2010 - link

    I sort of assumed it was similar to what 120hz/240hz LCD TVs do: use a frame doubler to more closely match your monitor's refresh rate and give the impression of "smooth" motion.
  • andy o - Thursday, February 4, 2010 - link

    I don't think so; most PC displays are 60 Hz, and I think even most 120 Hz TVs only accept up to 60p input. There are only a couple of 120Hz-input monitors.
  • therealnickdanger - Wednesday, February 10, 2010 - link

    Actually, most CRTs using analog connections are capable of 120Hz. DVI, HDMI, and DisplayPort do not support digital transmission speeds over 60Hz. It's a sad state of affairs if you ask me.
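
    For a rough sense of the math: a quick pixel-clock estimate (assuming ~20% blanking overhead, a CVT-style figure) against single-link DVI's 165 MHz TMDS limit:

        # Approximate pixel clock needed for 1920x1080 at a given refresh rate,
        # assuming ~20% blanking overhead (a rough CVT-style figure).
        def pixel_clock_mhz(w, h, hz, blanking=0.20):
            return w * h * hz * (1 + blanking) / 1e6

        for hz in (60, 120):
            print(f"1080p @ {hz} Hz: ~{pixel_clock_mhz(1920, 1080, hz):.0f} MHz")

        # ~149 MHz at 60 Hz fits under single-link DVI's 165 MHz TMDS limit;
        # ~299 MHz at 120 Hz does not, which is why 120 Hz stayed mostly an
        # analog/CRT affair on the common digital connections of the era.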
  • sc3252 - Thursday, February 4, 2010 - link

    I know this isn't exactly supposed to be a fast card, but it's clocked ~10% faster yet it's slower than the last-generation card... I can't say I'm surprised, though, after seeing the 5770 clocked faster than the 4870 yet performing around the same.
  • StevoLincolnite - Thursday, February 4, 2010 - link

    I think people are missing a lot of the big picture here, and that's CrossFire with the Radeon 54xx series.
    Specifically with the new 8-series chipsets - hence the number of shaders present - I expect a return of Hybrid CrossFire.

    Pairing an IGP with a low-end card is a very cost-effective way to get more performance out of a system, and it also gives AMD an edge in getting more people to buy an AMD processor + chipset + graphics card.
  • ereavis - Thursday, February 4, 2010 - link

    "me too" I'd dig a 758G Hybrid Crossfire review with this and the other sub $100 Radeons (if they support x-fire) 785 was a great motherboard to match with the Athlon II and Phenom II X2-X3, some of us were waiting on video card purchases and would like to see Crossfire 54XX/56XX compared to a 5750 discrete for example.
  • JarredWalton - Friday, February 5, 2010 - link

    Fun fact: HD 5450 is about 40% faster than my pathetic old 7600 GT in my work PC. Remember when 6600 GT was da bomb? LOL
  • QuietOC - Thursday, February 4, 2010 - link

    The 80-shader discrete Radeons are just too limited by 64-bit DDR3. The 785G has more bandwidth and the same number of ROPs (mine even runs fine at 1GHz). If they had cut the power usage of the 5450 down a lot more, it might have made some sense.
  • Totally - Thursday, February 4, 2010 - link

    5770 128-bit bus, 4870 256-bit bus

    again 5450 64-bit bus, 4550 128-bit bus
  • Ryan Smith - Thursday, February 4, 2010 - link

    The 4550 (RV710) had a 64-bit bus.
  • Ryan Smith - Thursday, February 4, 2010 - link

    Images are a WIP.
  • SLEEPER5555 - Thursday, February 4, 2010 - link

    Is proofreading also a work in progress? In a hurry to get the article posted or what?
  • sc3252 - Thursday, February 4, 2010 - link

    Is this a test to see if people will read articles without pictures? Because I think I failed... Maybe next time just pull the article until the pictures/charts are ready.
  • bonsai57 - Saturday, August 7, 2010 - link

    The article's title calls this "The Next Step In HTPC Video Cards" - Home Theater Personal Computer video cards, am I right in that assumption?
    If so, then why are the benchmarks oriented more towards gaming? Obviously this is no gaming card.
    Please give a true outlook on how this card performs connected to a TV. Is it OK? Is it junk?
    My intent for this card would be a connection to a plasma or perhaps an LCD display.
    Streaming video over the internet, such as Netflix or Hulu, would be one concern; Blu-ray playback on the PC another.
    Perhaps my perception of this card and the subsequent review are irrelevant, but please don't game-benchmark a card that's meant for something entirely different.
    Perhaps I have misconstrued the intent of this article; if so, I apologize.
  • Sahana Munasinghe - Saturday, March 9, 2013 - link

    According to PC Wizard 2012, the ATI HD 5450 has 1 GPU and 2 cores. Does that mean the HD 5450 has 2 graphics processors?