Esato Forum Index > General discussions > Rumours > Sony Xperia Rumors 2015
CrownedAkuma Posts: > 500
On 2015-01-07 18:49:59, JohnnyNr.5 wrote:
If the 1080p version of the Z4 has the same specs as the QHD version but with a cheaper price point and better battery life, then I'm totally fine.
[ This Message was edited by: JohnnyNr.5 on 2015-01-07 17:50 ]
Couldn't agree more on this
--
Posted: 2015-01-07 19:30:03
On 2014-12-14 12:37:40, nodarsixar wrote:
we have Z4 spec!
5.2 inch
Snapdragon 810
3GB Ram
3300 Mah Battery
21MP Camera
New UI
Best AUDIO
IP68
140gram
LTE CAt 6
AND
Z4 Compact- NO
Z4 Ultra - NO
Z5 Compact- YES
4GB RAM?
--
Posted: 2015-01-07 19:37:20
16 GB internal storage needs to die.
--
Posted: 2015-01-07 21:00:31
So the sensor is confirmed to be a new one if it's 21 MP? The old one wasn't very good at 20 MP; I can't imagine what it's going to be like if they stretch it to 21 MP.
--
Posted: 2015-01-07 21:05:10
CrownedAkuma Posts: > 500
On 2015-01-07 21:05:10, Gitaroo wrote:
So the sensor is confirmed to be a new one if it's 21 MP? The old one wasn't very good at 20 MP; I can't imagine what it's going to be like if they stretch it to 21 MP.
Also, if I recall correctly, the old one was a 1/2.3" sensor with 20.7 MP, whereas this is a 1/2.4" with 21 MP... it should be even worse?!?
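That worry can be sanity-checked with rough pixel-pitch arithmetic. The sensor dimensions below are commonly quoted figures for those optical formats, not Sony-published die sizes, so treat them as assumptions:

```python
import math

# Approximate active-area dimensions in mm (typical published values for
# these sensor formats; exact dies vary per model, so these are assumptions).
SENSORS = {
    '1/2.3" @ 20.7 MP': (6.17, 4.55, 20.7e6),
    '1/2.4" @ 21.0 MP': (5.90, 4.43, 21.0e6),
}

def pixel_pitch_um(width_mm, height_mm, pixels):
    """Edge length of one (square) pixel, in micrometres."""
    area_mm2 = width_mm * height_mm
    return math.sqrt(area_mm2 / pixels) * 1000.0  # mm -> um

for name, (w, h, n) in SENSORS.items():
    print(f"{name}: ~{pixel_pitch_um(w, h, n):.2f} um pixel pitch")
```

Under those assumptions the pitch drops from roughly 1.16 um to roughly 1.12 um, so the "smaller sensor, more pixels" concern is at least directionally right.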
--
Posted: 2015-01-07 22:50:44
I really hope Sony won't release an S810-equipped Z3 without flaps as the "new" Z4. 16GB ROM would be just....
--
Posted: 2015-01-07 23:00:40
you guys are so harsh... but I guess that's what real fans have to be
--
Posted: 2015-01-08 01:03:21
On 2015-01-07 21:05:10, Gitaroo wrote:
So the sensor is confirmed to be a new one if it's 21 MP? The old one wasn't very good at 20 MP; I can't imagine what it's going to be like if they stretch it to 21 MP.
It's the same 20.7MP sensor as the Z1, Z2, Z3.
--
Posted: 2015-01-08 05:58:41
On 2015-01-07 18:13:14, ascariss wrote:
Here is a comparison from anandtech, again running Manhattan 1080p (offscreen) with X1’s GPU underclocked to match the performance of the A8X at roughly 33fps.
http://www.anandtech.com/show/8811/nvidia-tegra-x1-preview/3
NVIDIA’s tools show the X1’s GPU averages 1.51W over the run of Manhattan. Meanwhile the A8X’s GPU averages 2.67W, over a watt more for otherwise equal performance. This test is especially notable since both SoCs are manufactured on the same TSMC 20nm SoC process, which means that any performance differences between the two devices are solely a function of energy efficiency.
Not saying this is a super scientific test, but I feel the 20nm process and the ARM cores help with power drain. I feel the target for this chip will be tablets, of course, not phones.
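Reading those quoted numbers as perf-per-watt makes the gap concrete. The ~33 fps figure is the matched throughput from the article, so efficiency differences show up directly in average power draw:

```python
# Numbers quoted from the AnandTech Tegra X1 preview above: both GPUs held
# at roughly the same ~33 fps in Manhattan 1080p (offscreen).
RESULTS = {
    "Tegra X1 GPU": {"fps": 33.0, "avg_watts": 1.51},
    "Apple A8X GPU": {"fps": 33.0, "avg_watts": 2.67},
}

for name, r in RESULTS.items():
    print(f"{name}: {r['fps'] / r['avg_watts']:.1f} fps per watt")

# Same throughput, so the power ratio is the efficiency ratio.
ratio = RESULTS["Apple A8X GPU"]["avg_watts"] / RESULTS["Tegra X1 GPU"]["avg_watts"]
print(f"X1 draws ~{ratio:.2f}x less power for the same work")
```

That works out to roughly 21.9 vs 12.4 fps per watt, about a 1.77x efficiency edge for the X1 in this one benchmark.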
Video playback, even at 4K, doesn't use all the resources, especially when you have a hardware decoder, which keeps most of the CPU and GPU usage low (not 0%, as they still need to do something)...
We're talking here about maximum power usage in the worst real-world scenario, something like heavy gaming...
I'm not saying that Tegra is bad, it's very good, but it's not 100% designed to be a mobile chip; that's why NVIDIA uses more power than usual just to compete with the other makers.
Maxwell is good, in fact it's very good, they did a hell of a job with it, but let's face it, it's a desktop + laptop part at the end of the day... NV optimised it for mobile, removed some things, reduced power, used a low-power manufacturing process... but it can't be compared to a GPU designed for mobile from the ground up, like Adreno or PowerVR for example...
To give some history:
Adreno actually comes from ATi, which was later acquired by AMD, Adreno included. After a while AMD sold off all its mobile assets, as the stupid CEO didn't believe in mobile (which exploded less than two years later)... Qualcomm bought the mobile assets from AMD, including Adreno.
ATi designed Adreno as a graphics processor for mobile and special applications like TVs, arcade gaming machines and so on. It was used in mobiles too, but not that much, since mobile wasn't such a hit at that time.
At the same time they had their other GPU line for computers (desktop & laptop), which was a completely different design; both were separate projects, as they targeted completely different segments. Of course they shared some technologies, being graphics processors at the end of the day, and Qualcomm has continued advancing it since...
Mobile-designed GPUs and desktop/laptop-class GPUs still share technologies. For example, mobile moved to unified shaders after desktop-class GPUs made the move first; mobile GPUs started getting DX support too; and tessellation is another thing that started on discrete GPUs and then found its way to mobile.
If you know more about PC history: in the days of the Pentium 3 and then Pentium 4, AMD introduced the Athlon and then the Athlon XP. That processor changed the balance, as the Athlon was faster than Intel's chips for the first time in years. Intel tried to catch up with the Pentium 4, but they did it wrong: they increased the pipeline depth. While that allows faster clocks, it also increases power usage. They tried but never succeeded, and in the latest NetBurst Pentium 4 designs they deepened the pipelines even more, chasing 4 and 5 GHz speeds; they already had 6 GHz clocks on their roadmap, but the chips ran too hot to even reach 4 GHz and consumed too much power, while AMD was playing nice with the Athlon 64 and Athlon 64 X2 by then.

At the same time, another team at Intel was very happy, as they had designed a new architecture for laptops: very good, with high performance and low power usage compared to any Pentium 3 or Pentium 4. That design led to the first Intel Core CPUs. Intel then decided to ditch the Pentium completely and use the Core design even for desktops, and so Core 2 was born. Only then did Intel flip the formula: the Core 2 had a much shorter pipeline but much higher IPC, so it ran at a lower clock yet gave more performance, and the lower clock also meant much lower power consumption compared to the Pentium 4...
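The clock-versus-IPC trade-off above reduces to one line of arithmetic: throughput is roughly IPC times clock. The figures below are made-up round numbers chosen only to illustrate the shape of the trade-off, not measured data for any real chip:

```python
# Illustrative only: invented round numbers, not real P4/Core 2 measurements.
def perf_gips(ipc, clock_ghz):
    """Rough throughput in billions of instructions per second."""
    return ipc * clock_ghz

deep_pipeline = perf_gips(ipc=1.0, clock_ghz=3.8)   # NetBurst-style: clock-first
short_pipeline = perf_gips(ipc=2.0, clock_ghz=2.4)  # Core-2-style: IPC-first

print(f"deep pipeline:  {deep_pipeline:.1f} GIPS at 3.8 GHz")
print(f"short pipeline: {short_pipeline:.1f} GIPS at 2.4 GHz")
# The IPC-first design wins on throughput while clocking 1.4 GHz lower,
# and since dynamic power scales with frequency (and voltage squared),
# the lower clock also tends to mean lower power.
```

That is the formula Intel "flipped": instead of buying performance with clock speed (and power), Core 2 bought it with work done per cycle.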
Only then did Intel take back the crown of best CPU maker from AMD, and AMD is still struggling, as they never thought enough about power usage; their desktop parts still favour higher clocks to get performance, and their designs weren't good enough to compete. The latest desktop offerings are still not that good compared to Intel's, but there's still hope, and this is where the mobile parts come in.
The main guy who designed the original Athlon and Athlon 64 was smart, and still is; that's why those parts were very good. But he left AMD and went to work at Apple, and you can guess the rest: he led the team that made Apple's A7 & A8 processors... he knows what he's doing...
A few months ago AMD re-hired him, and he's now working on multiple projects, including a high-performance ARM design and AMD's next x86 architecture, which they call Zen. It should at least compete with Intel, who by that time should have Skylake or even its successor, since Zen won't see the light of day until 2016...
--
Posted: 2015-01-08 07:11:00
I guess back in 2006, when AMD acquired ATi, no one could have estimated how influential the mobile market would become. That's why we see the PC market shrinking so dramatically today. In 2008 came the economic disaster, and AMD needed to cut units that couldn't make money; that was just a year after the iPhone 2G was introduced, and mobile devices really caught fire around 2010 and onward, right up to today.
AMD got the master Jim Keller back from Apple. It would be interesting to see if he can work a miracle just once more, but today AMD has quite limited resources (i.e. money and foundries), and unfortunately the economic climate around the world isn't helping either.
--
Posted: 2015-01-08 15:32:10