ASRock Industrial NUCS BOX-1360P/D4 Review: Raptor Lake-P Impresses, plus Surprise ECC
by Ganesh T S on January 29, 2023 9:30 AM EST
HTPC Credentials
The 2022 Q4 update to our system reviews brings an updated HTPC evaluation suite. After dropping the display refresh rate stability and Netflix streaming tests, the local media playback configurations have also been revamped. This section details each of the workloads processed on the ASRock NUCS BOX-1360P-D4 as part of the HTPC suite.
YouTube Streaming Efficiency
YouTube remains one of the top OTT platforms, primarily due to its free ad-supported tier. Our HTPC test suite update retains YouTube streaming efficiency evaluation as a metric of OTT support in different systems. Mystery Box's Peru 8K HDR 60FPS video is the chosen test sample. On PCs running Windows, it is recommended that HDR streaming videos be viewed in the Microsoft Edge browser after putting the desktop in HDR mode.
YouTube Streaming Statistics
The GPU in the ASRock NUCS BOX-1360P-D4 supports hardware decoding of VP9 Profile 2, and we see the stream encoded with that codec being played back. The streaming is perfect, thanks to the powerful GPU and hardware decoding support - the couple of dropped frames observed in the statistics are due to the mouse clicks involved in bringing up the overlay.
The streaming efficiency-related aspects such as GPU usage and at-wall power consumption are also graphed below.
YouTube Streaming Efficiency
Interestingly, we see both decoder usage and D3D usage going up after enabling ECC. There were no visible dropped frames in the ECC case except during the activation of the OSD overlays. The higher power consumption numbers also contribute to the dismal energy efficiency of the ECC configuration.
Hardware-Accelerated Encoding and Decoding
The transcoding benchmarks in the systems performance section presented results from evaluating the QuickSync encoder within Handbrake's framework. The capabilities of the decoder engine are brought out by DXVAChecker.
Video Decoding Hardware Acceleration in ASRock NUCS BOX-1360P-D4
The iGPU in the Raptor Lake-P system supports hardware decode for a variety of codecs, including AVC, JPEG, HEVC (8b and 10b, 4:2:0 and 4:4:4), and VP9 (8b and 10b, 4:2:0 and 4:4:4). AV1 decode support is also present. This is currently the most comprehensive codec support seen in the PC space.
Local Media Playback
Evaluation of local media playback and video processing is done by playing back files encompassing a range of relevant codecs, containers, resolutions, and frame rates. A note of the efficiency is also made by tracking GPU usage and power consumption of the system at the wall. Users have their own preferences for playback software / decoder / renderer, and our aim is to have numbers representative of commonly encountered scenarios. Our Q4 2022 test suite update replaces MPC-HC (in LAV filters / madVR modes) with mpv. In addition to being cross-platform and open-source, the player allows easy command-line control to enable different shader-based post-processing algorithms. From a benchmarking perspective, the more attractive aspect is the real-time reporting of dropped frames in an easily parseable manner. The players / configurations considered in this subsection include:
- VLC 3.0.18
- Kodi 20.0b1
- mpv 0.35 (hwdec auto, vo=gpu-next)
- mpv 0.35 (hwdec auto, vo=gpu-next, profile=gpu-hq)
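The two mpv configurations listed above map onto command lines along these lines (a sketch; the clip name is a placeholder):

```shell
# mpv "Default" configuration: automatic hardware decode, gpu-next video output
mpv --hwdec=auto --vo=gpu-next sample-clip.mkv

# mpv "GPU-HQ" configuration: adds the built-in high-quality scaling/shader profile
mpv --hwdec=auto --vo=gpu-next --profile=gpu-hq sample-clip.mkv

# Dropped-frame counters can be polled during playback via mpv's JSON IPC server:
# the frame-drop-count property reports video-output drops, while
# decoder-frame-drop-count reports drops inside the decoder.
mpv --hwdec=auto --vo=gpu-next --input-ipc-server=/tmp/mpv.sock sample-clip.mkv
```

The separate video-output and decoder drop counters are what allow the distinction drawn later between decoder-side and presentation-side frame drops.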
Fourteen test streams (each of 90s duration) were played back from the local disk with an interval of 30 seconds in-between. Various metrics including GPU usage, at-wall power consumption, and total energy consumption were recorded during the course of this playback.
All our playback tests were done with the desktop HDR setting turned on. Certain system configurations can automatically toggle the HDR capabilities prior to the playback of an HDR video, but we didn't take advantage of that in our testing.
VLC Playback Efficiency
While playback was perfect for all codecs except AV1 (the CPU is not strong enough for software-only 8Kp60 decoding), the power consumption numbers are off a relatively high idle base. This results in the workload energy consumption being in the lower half of the pack for both configurations.
Kodi Playback Efficiency
The scenario seen with VLC is replicated in Kodi also, with the high idle power consumption base driving up the energy numbers even though the delta is quite reasonable.
mpv (Default) Playback Efficiency
mpv playback with the gpu-next video output driver is the most energy efficient of the lot. We also have hardware accelerated decode for AV1. However, the playback for that clip still has issues, with approximately 60% of the frames getting dropped in the video output (the decoder itself doesn't drop any frames).
This may warrant investigation by the mpv / gpu-next developers and/or Intel's driver team. It does appear to be a software issue that can be resolved in the long run.
mpv (GPU-HQ) Playback Efficiency
Activating the GPU shaders for video post processing does result in increased energy consumption, but there are no dropped frames. The 8Kp60 AV1 decode video output issue remains the same irrespective of the profile used.
Comments
drajitshnew - Sunday, January 29, 2023 - link
The in-band ECC is an absolutely brilliant idea for systems with 64 GB or more. It is unfortunate that Windows does not support it.
Samus - Sunday, January 29, 2023 - link
My understanding is this doesn't need support at the software level. This is still "hardware ECC" and OS-independent.
Samus - Sunday, January 29, 2023 - link
Oh, I see what you are saying - about how Windows will handle an error. In AT's memtest run, the test triggered a stop interrupt, presumably because it didn't know how to handle the error. I see what you are getting at with Windows.
bernstein - Monday, January 30, 2023 - link
It's more likely that chrome mandates ECC support, while with Windows Intel pushes ECC as a $$$ feature.
sjkpublic@gmail.com - Monday, February 13, 2023 - link
This competes with laptops. Please expand on why ECC is coming up?
mode_13h - Tuesday, February 14, 2023 - link
> Please expand on why ECC is coming up?

This is sold as an industrial mini-PC. For something like that, reliability is key. Memory errors are one potential source of reliability problems, and ECC is an effective measure to compensate (short-term) and flag for replacement (long-term) any defective memory modules or boards.
The lore behind ECC is that it protects against cosmic rays, but I've only personally seen ECC errors that seem tied to flaky or failing hardware. It's worthwhile for that purpose alone.
TLindgren - Sunday, January 29, 2023 - link
It needs to be noted that SECDED over 512 bits is FAR less powerful in handling errors than SECDED over 64 bits like regular ECC (or SECDED over 32 bits using DDR5 ECC sticks). They could have instead emulated SECDED over each 64-bit chunk, but then the extra reserved memory would have needed to be 8GB instead of 2GB, and the performance penalty would likely have been significantly worse.

SECDED means it's guaranteed to correct one incorrect bit (SEC) and detect two incorrect bits (DED); there are no guarantees for what happens with more incorrect bits, but there's a decent statistical chance it'll detect them (and no chance it'll fix them).
Obviously, getting two or even three+ faulty bits in the same "group" is far more likely over 512 bits than over 64 bits; in fact, it's my understanding that it'll likely happen most of the time, given how memory sticks are constructed!
It's still useful, because it'll detect a certain percentage of multi-bit errors, so you will often get told that you have faulty memory before things crash (except this doesn't seem to work here), which means you know you need to fix the hardware. But the "correct bits" part is unlikely to save you, because at least some of the time there will be multiple wrong bits in the burst. I suspect they would have been better off giving up on correcting and aiming for "detect as many bit errors as we can" (probably 3-4 guaranteed bits detected with the 16 bits of extra data per 512 bits they chose).
It's definitely better than no ECC *if* the software support gets improved a bit, but it is in no way comparable to "real" ECC. OTOH, it's not priced as "real" ECC either; it just needs to be pointed out, because some people will sell it as if it is.
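The comment's arithmetic can be checked with a small sketch. The per-bit error probability below is an arbitrary illustrative number, not a measured rate; the sketch computes the chance of an uncorrectable (2+ bit) error per SECDED-protected block for 512-bit vs 64-bit blocks, plus the reserved-capacity figures for a 64 GB system.

```python
def p_uncorrectable(block_bits, p_bit):
    """Probability of >= 2 bit flips (beyond SECDED's single-bit correction)
    in a block of block_bits independent bits."""
    p0 = (1 - p_bit) ** block_bits                              # no errors
    p1 = block_bits * p_bit * (1 - p_bit) ** (block_bits - 1)   # exactly one error
    return 1 - p0 - p1

p = 1e-6  # illustrative per-bit error probability (assumption, not a measurement)

p_512 = p_uncorrectable(512, p)  # in-band ECC: SECDED over 512-bit blocks
p_64 = p_uncorrectable(64, p)    # conventional ECC: SECDED over 64-bit words
ratio = p_512 / p_64             # ~= C(512,2)/C(64,2), roughly 65x worse per block

# Reserved-capacity arithmetic from the comment, for a 64 GB system:
inband_reserved_gb = 64 * 16 / 512  # 16 check bits per 512 data bits -> 2 GB
per_word_reserved_gb = 64 * 8 / 64  # 8 check bits per 64 data bits  -> 8 GB

print(f"P(uncorrectable) per 512-bit block: {p_512:.3e}")
print(f"P(uncorrectable) per 64-bit word:   {p_64:.3e}  (ratio ~{ratio:.0f}x)")
print(f"reserved: {inband_reserved_gb:.0f} GB (in-band) vs {per_word_reserved_gb:.0f} GB (per-word)")
```

Note this models independent bit flips only; as the comment observes, real failure modes (a whole chip or burst going bad) make multi-bit errors within one 512-bit block considerably more likely than this independence assumption suggests.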
ganeshts - Monday, January 30, 2023 - link
Taken standalone, your arguments are completely sound.

However, in the bigger picture, you should note that newer memory technologies include link ECC to protect the high-speed communication link between the SoC and the external memory, AND the DRAM DIMMs themselves implement transparent ECC for the stored data.
Overall, even mission-critical requirements like ASIL / ISO26262 (for automotive safety) can be met with the requisite FIT (failure-in-time) rate using SECDED protection for 512-bit blocks *assuming those other protection mechanisms are also in place*.
In-band ECC is also used on Tegra for such embedded applications [ https://twitter.com/never_released/status/13559704... ; I can't seem to dig up the original documentation, but remember this was heavily discussed when the Tegra feature was made public ].
ganeshts - Monday, January 30, 2023 - link
(Correction: DRAM DIMMs -> The memory chips)
mode_13h - Tuesday, February 14, 2023 - link
> you should note that newer memory technologies include link ECC to protect the high-speed communication link between the SoC and the external memory

Are you saying the system you reviewed also supports traditional out-of-band ECC? Why wasn't that mentioned in the review? If not, then your point would seem to be moot.
I also don't see the point of using in-band ECC atop OOB ECC. Anything that OOB ECC can't correct doesn't seem like it's going to be correctable by in-band ECC.