Channel: Xilinx Wiki : Xilinx Wiki - all changes

Zynq UltraScale+ MPSoC - ZCU106 HDMI Example Design

Date: June 22, 2018
Revision: 1.0
Author: E.Srikanth/M Damoder
Description: First HDMI Example Design of ZCU106

Description/Summary
This technical article provides an overview of the ZCU106 HDMI Example design, which leverages the Video Codec Unit (VCU) hard block on Zynq UltraScale+ MPSoC EV devices. This article uses the Vivado IP Integrator (IPI) flow for building the hardware design and the Xilinx Yocto PetaLinux flow for the software design.
The ZCU106 HDMI Example design has an HDMI Receiver capture pipeline implemented in the PL, to which a video source is attached. It also has an HDMI Tx display pipeline implemented in the PL, to which an HDMI display is connected.
Finally, the user guide also demonstrates the encode and decode capabilities of the VCU block using the video captured and displayed by the HDMI Receiver and Transmitter pipelines.
{zcu106_board_hdmi_refdsgn.png} ZCU106 Evaluation Platform
The ZCU106 HDMI Example Design uses the following IPs along with the Zynq UltraScale+ Processing System for demonstrating video capture, encode, decode, display and streaming using the VCU block on Zynq UltraScale+ MPSoC EV devices.
Zynq UltraScale+ MPSoC Processing System
HDMI Receiver Pipeline containing the following IPs
HDMI RX Subsystem
Video Scaler
Frame Buffer Write IP
HDMI Transmitter Pipeline containing the following IPs
HDMI TX Subsystem
Video Mixer
Frame Buffer Read IP
I2C Controller
Video Codec Unit (VCU)
{ZCU106_HDMI BLOCK DIAGRAM.png} ZCU106 HDMI Example Design Block Diagram

Implementation
Implementation Details

Download, Installation and Licensing
The Vivado Design Suite User Guide explains how to download and install the Vivado® Design Suite tools, which include the Vivado Integrated Design Environment (IDE), the High-Level Synthesis tool, and System Generator for DSP. This guide also provides information about licensing and administering evaluation and full copies of Xilinx design tools and intellectual property (IP) products. The Vivado Design Suite can be downloaded from the Xilinx website.
LogiCORE IP Licensing
The following IP cores require a license to build the design.
Video Test Pattern Generator (TPG) - Free License but must be downloaded
Video Timing controller (VTC) - Free License but must be downloaded
Video Mixer- Purchase license (Hardware evaluation available)
Video PHY Controller - Free License but must be downloaded
HDMI Rx/Tx Subsystem - Purchase license (Hardware evaluation available)
Video Processing Subsystem (VPSS) - Purchase license (Hardware evaluation available)
MIPI CSI Controller Subsystems (mipi_csi2_rx_subsystem) - Purchase license (Hardware evaluation available)
To obtain the LogiCORE IP license, please visit the respective IP product page and get the license.
AR# 44029 - Licensing - LogiCORE IP Core licensing questions?
Xilinx Licensing FAQ (https://www.xilinx.com/products/design-tools/faq.html)
LogiCORE IP Project License Terms
Building the Hardware and Software
This user guide is accompanied by the ZCU106 HDMI Example Design files (zcu106_hdmi_ex_2018.1.zip).
Download this zip file to a local directory or folder on your Windows or Linux machine to run the hardware and software build steps described in the following sections of this document.
NOTE:
It is recommended to use Linux for building the Vivado project.
Building Hardware Design
This section explains the steps to build the ZCU106 HDMI Example design.
Refer to the Vivado Design Suite User Guide UG973 (v2018.1) for setting up the Vivado 2018.1 environment.


ZynqMP DisplayPort Linux driver

NOTE: The TPG command below should only be used before running modetest -P.
modetest -D fd4a0000.zynqmp-display -w 34:tpg:1 -w 35:alpha:100
Live Input
The live input can be configured through the Xilinx bridge interface (https://gitenterprise.xilinx.com/Linux/linux-xlnx/blob/xilinx-v2018.2/drivers/gpu/drm/xlnx/xlnx_bridge.h#L65), which provides kernel-level APIs. The APIs can be used directly by a client driver, e.g. the V4L TPG driver. The interface is also available through debugfs for userspace applications to control, as this feature is at an experimental stage as of the 2018.2 release.
The live input requires the clock and timing to be generated from the PL, so this requirement should be reflected in the PL design. The clock should be specified in the devicetree as 'dp_live_video_in_clk' (https://gitenterprise.xilinx.com/Linux/linux-xlnx/blob/xilinx-v2018.2/Documentation/devicetree/bindings/display/xlnx/xlnx%2Czynqmp-dpsub.txt#L20). The driver automatically switches to the PL clock when switching to live input. The DT binding documentation describes the clocks as below.
- clocks: phandles for axi, audio, non-live video, and live video clocks.
axi clock is required. Audio clock is optional. If not present, audio will
be disabled. One of non-live or live video clock should be present.
- clock-names: The identification strings are required. "aclk" for axi clock.
"dp_aud_clk" for audio clock. "dp_vtc_pixel_clk_in" for non-live video clock.
"dp_live_video_in_clk" for live video clock (clock from programmable logic).
As an example, if the si570 is connected to DP live clock through FPGA logic, the DT change is as below.
diff --git a/arch/arm64/boot/dts/xilinx/zynqmp-clk-ccf.dtsi b/arch/arm64/boot/dts/xilinx/zynqmp-clk-ccf.dtsi
index a02ad79..dd747d0 100644
--- a/arch/arm64/boot/dts/xilinx/zynqmp-clk-ccf.dtsi
+++ b/arch/arm64/boot/dts/xilinx/zynqmp-clk-ccf.dtsi
@@ -272,7 +272,7 @@
};
&zynqmp_dpsub {
- clocks = <&dp_aclk>, <&clk 17>, <&clk 16>;
+ clocks = <&dp_aclk>, <&clk 17>, <&clk 16>, <&si570_1>;
};
&xlnx_dpdma {
diff --git a/arch/arm64/boot/dts/xilinx/zynqmp.dtsi b/arch/arm64/boot/dts/xilinx/zynqmp.dtsi
index fb196a1..acd7cf2 100644
--- a/arch/arm64/boot/dts/xilinx/zynqmp.dtsi
+++ b/arch/arm64/boot/dts/xilinx/zynqmp.dtsi
@@ -1151,7 +1151,8 @@
interrupt-parent = <&gic>;
clock-names = "dp_apb_clk", "dp_aud_clk",
- "dp_vtc_pixel_clk_in";
+ "dp_vtc_pixel_clk_in",
+ "dp_live_video_in_clk";
power-domains = <&pd_dp>;
There are two live inputs available, video and graphics, and two debugfs entries are available.
/sys/kernel/debug/xlnx-bridge/xlnx_bridge-vid-layer
/sys/kernel/debug/xlnx-bridge/xlnx_bridge-gfx-layer
Some operations can be done through debugfs: enable / disable / set_input (https://gitenterprise.xilinx.com/Linux/linux-xlnx/blob/xilinx-v2018.2/drivers/gpu/drm/xlnx/xlnx_bridge.c#L310). But please note that debugfs is not treated as a stable ABI, so applications shouldn't rely on the specific ABI; it may change from one version to another.
The commands below set the video layer to use live input.
echo "set_input 1920 1080 0x100e" > /sys/kernel/debug/xlnx-bridge/xlnx_bridge-vid-layer
echo "enable" > /sys/kernel/debug/xlnx-bridge/xlnx_bridge-vid-layer
For 'set_input', the 1st arg is width, the 2nd arg is height, and the 3rd arg is the media bus format (from media-bus-format.h). 0x100e maps to RGB24 (https://gitenterprise.xilinx.com/Linux/linux-xlnx/blob/xilinx-v2018.2/include/uapi/linux/media-bus-format.h#L49).
This assumes that the logic connected to the live input (e.g. TPG / VTC) is already configured prior to the enable command. The live input can then be disabled as below.
echo "disable" > /sys/kernel/debug/xlnx-bridge/xlnx_bridge-vid-layer
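The debugfs sequence above can be wrapped in a small helper script. This is an illustrative sketch, not part of the driver: the resolution and format values are the ones from the example above, and the debugfs path assumes the 2018.2 driver.

```shell
#!/bin/sh
# Sketch: drive the Xilinx bridge debugfs interface for the video layer.
# Path and values follow the example above; adjust for your design.
BRIDGE=/sys/kernel/debug/xlnx-bridge/xlnx_bridge-vid-layer

live_on() {
    # args: width height media-bus-format (0x100e = RGB24)
    echo "set_input 1920 1080 0x100e" > "$BRIDGE"
    echo "enable" > "$BRIDGE"
}

live_off() {
    echo "disable" > "$BRIDGE"
}
```

Since debugfs is not a stable ABI, a script like this may need updating between kernel releases.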

ALSA
The "aplay" utility is an ALSA (Linux audio/sound subsystem) example and demonstrates audio playback. The command sends audio data to a specific PCM channel, which is audible at the DP output.
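As a hedged illustration (the device name and file name below are placeholders, not taken from this article), a WAV file can be played on a specific PCM device like this:

```shell
# List the available playback devices, then play a file on card 0, device 0.
# hw:0,0 and test.wav are placeholders; pick the DP audio PCM from the list.
aplay -l
aplay -D hw:0,0 test.wav
```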


Zynq UltraScale MPSoC VCU TRD 2018.1 - SDI Design

Power on the board; make sure INIT_B, done and all power rail LEDs are lit green.
After ~30 seconds, the display will turn on and the application will start automatically, targeting the max supported resolution of the monitor (3840x2160).
To run it manually, enter % source /media/card/autostart.sh from the command line.
NOTE: The SD card file system is mounted at /media/card. Optional storage medium SATA and USB are mounted at /media/sata and /media/usb respectively.
1.3.1 GStreamer Application (vcu_gst_app)

On Linux:
Open a Linux terminal
...
directory to $TRD_HOME/pl
Apply the Vivado patch related to HDMI AR71203 (refer to http://xkb/Pages/71/71203.aspx)
The following designs can be generated with the provided script files:
VCU TRD Project ==> Multi-stream VCU TRD design supporting 3-HDMI, TPG & MIPI video pipelines
SDIRX_VCU_SDITX ==> VCU based video design showcasing SDI receive and SDI transmit capabilities of the ZCU106 board
HDMIRX_VCU ==> VCU based mini reference design showcasing HDMI receive along with VCU capabilities of ZCU106 & MPSoC
SDIRX_VCU ==> VCU based mini reference design showcasing SDI receive and VCU capabilities of ZCU106 & MPSoC
Enter the following commands in the Vivado shell to create the corresponding design:
=> For creating VCU TRD Hardware design
/> vivado -source scripts/vcu_trd_proj.tcl
=> For creating "SDIRX + VCU + SDI_TX" Hardware design
/> vivado -source scripts/sdirx_vcu_sditx_proj.tcl
=> For creating "HDMIRX + VCU" Hardware design
/> vivado -source scripts/hdmirx_vcu_proj.tcl
=> For creating "SDIRX + VCU" Hardware design
/> vivado -source scripts/sdirx_vcu_proj.tcl
Run the following command in the Vivado shell to create the GUI for the "SDIRX + VCU + SDI_TX" hardware design.
% vivado -source scripts/sdirx_vcu_sditx_proj.tcl
After executing the script, the Vivado IPI block design comes up as shown in the figure below.
{vcu_trd_17p3_fig6.png}
...
In the Export Hardware Platform for SDK window select "Include bitstream" and click "OK".
{vcu_trd_17p3_fig10.png}
...
created at $TRD_HOME/pl/sdirx_vcu_sditx_proj/zcu106_sdirx_vcu_sditx.sdk/zcu106_sdirx_vcu_sditx_top.hdf
HDF paths for other designs are:
pl/sdirx_vcu_sditx_proj/zcu106_sdirx_vcu_sditx.sdk
pl/sdirx_vcu_proj/zcu106_sdirx_vcu.sdk
pl/hdmirx_vcu_proj/zcu106_hdmirx_vcu.sdk

1.2.2 VCU PetaLinux BSP
This tutorial shows how to build the Linux image and boot image using the PetaLinux build tool.
...
% echo $PETALINUX
After the PetaLinux installation, the $PETALINUX environment variable should be set.
Configure and build the PetaLinux project for the VCU TRD design.
% cd $TRD_HOME/apu/
% petalinux-create -t project -s xilinx-sdi-rx-vcu-sdi-tx-trd-zcu106-zu7-es2-v2018.1-final.bsp
% cd vcu_petalinux_bsp
% petalinux-build

NOTE:
If the Vivado project is modified, the user is expected to configure the BSP with the modified HDF file and build:
% petalinux-config --get-hw-description=$TRD_HOME/pl/sdirx_vcu_sditx_proj/zcu106_sdirx_vcu_sditx.sdk
If the PetaLinux BSP is built on NFS, set CONFIG_TMP_DIR_LOCATION to a non-NFS filesystem, e.g. CONFIG_TMP_DIR_LOCATION="/tmp/vcu_petalinux_bsp".

Change the TMPDIR location (Yocto Settings -> TMPDIR Location -> set TMP dir location).
Build SDK components
% petalinux-build --sdk
Create a
...
FSBL, ATF, bitstream, and u-boot.
...
% cd images/linux
% petalinux-package --boot
Copy the generated boot image and Linux image to the SD card directory. Replace rev-x with the board revision.
...
BOOT.BIN image.ub $TRD_HOME/apu/vcu_trd_main/images/rev-x
1.2.3 VCU APM Library
This tutorial shows how to build the VCU APM library.
NOTE: SYSROOT needs to be pre-configured before building VCU libraries.
% source <compiled petalinux bsp path>/images/linux/sdk/environment-setup-aarch64-xilinx-linux
Open the SDK workspace using the Xilinx SDK tool.
% cd $TRD_HOME/sdi/apu
% xsdk -workspace . &
...
Execution of the application is shown below:
vcu_gst_app <input.cfg>
To calculate the latency of the pipeline:
GST_DEBUG="GST_TRACER:7" GST_TRACERS="latency;scheduletime" ./vcu_gst_app ./input.cfg

1.3.2 GStreamer Interface Library (vcu_gst_lib)
The VCU GStreamer v1.0 interface configures various video pipelines in the design and controls the data flow through these pipelines. It implements these features:

Zynq UltraScale MPSoC VCU TRD 2018.1

...
5.3.2 GStreamer Application (vcu_gst_app)
The vcu_gst_app is a command-line multi-threaded Linux application that uses the vcu_gst_lib interface, similar to vcu_qt. The only difference is that the user needs to manually feed the input and run the pipeline each time, whereas vcu_qt has to be launched only once. The command-line application requires an input configuration file (input.cfg) provided in plain text. Refer to Appendix A - Input Configuration File (input.cfg) for the file format and description.
To calculate the latency of the pipeline:
GST_DEBUG="GST_TRACER:7" GST_TRACERS="latency;scheduletime" ./vcu_gst_app ./input.cfg

5.3.3 GStreamer Interface Library (vcu_gst_lib)
The VCU GStreamer v1.0 interface configures various video pipelines in the design and controls the data flow through these pipelines. It implements these features:


Zynq UltraScale MPSoC VCU TRD 2018.1

{vcu_trd_17p3_fig10.png}
The HDF is created at $TRD_HOME/pl/vcu_trd_proj/zcu106_vcu_trd.sdk/zcu106_vcu_trd_wrapper.hdf
...
to step 5, HDFs for
...
are:
$TRD_HOME/pl/sdirx_vcu_sditx_proj/zcu106_sdirx_vcu_sditx.sdk/zcu106_sdirx_vcu_sditx_top.hdf
$TRD_HOME/pl/sdirx_vcu_proj/zcu106_sdirx_vcu.sdk/zcu106_sdirx_vcu_sditx_top.hdf
$TRD_HOME/pl/hdmirx_vcu_proj/zcu106_hdmirx_vcu.sdk/zcu106_hdmirx_vcu_wrapper.hdf

5.2.2 VCU PetaLinux BSP
This tutorial shows how to build the Linux image and boot image using the PetaLinux build tool.


Solution ZynqMP PL Programming

...
Software Flow
{flow.PNG}
...
the required bitstream.
References:
https://github.com/Xilinx/embeddedsw/tree/master/lib/sw_services/xilfpga/doc
...
Features supported in the driver
Full Bitstream loading.
...
Reconfiguration (Partial bitstream loading).
Encrypted Bitstream loading with user key.
Authenticated Bitstream loading.
...
It is capable of loading only .bin format files into PL. It does not support any other file format.
No support for PS-PL configuration.
...
for Partial Bitstream loading with
No out-of-box automated solution for creating Device Tree Overlays with the Xilinx PetaLinux flow
No support for Read-back Bitstream.
...
https://github.com/Xilinx/linux-xlnx/blob/master/Documentation/devicetree/bindings/fpga/fpga-region.txt
FPGA programming using Device Tree Overlay (DTO)
...
probed after bitstream programming
DTO contains:
Target FPGA Region
...
Child devices
- child nodes corresponding to hardware that will be loaded in this region of the FPGA.
...
programming the bitstream using the overlay:
Steps to remove the drivers that were added as part of the DTO: Refer
Example:
...
Steps to Load Full Bitstream
Once Linux is up, run the below commands to load the required Full Bitstream.
...
for Full Bitstream.
echo 0 > /sys/class/fpga_manager/fpga0/flags
2) Copy the Full Bitstream (.bin) and pl.dtbo files into the firmware folder
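The steps above can be sketched as a command sequence. This is an assumption-laden illustration, not the article's exact procedure: the /lib/firmware directory and the configfs overlay path follow the mainline FPGA-region flow, and design.bin / pl.dtbo are placeholder file names.

```shell
# 1) Select full-bitstream mode on the FPGA manager.
echo 0 > /sys/class/fpga_manager/fpga0/flags
# 2) Make the bitstream and overlay visible to the firmware loader.
cp design.bin pl.dtbo /lib/firmware/
# 3) Apply the overlay; the kernel programs the PL and probes child devices.
mkdir /sys/kernel/config/device-tree/overlays/full
echo pl.dtbo > /sys/kernel/config/device-tree/overlays/full/dtbo
```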

Xilfpga

...
No support for Only Encrypted Bitstream loading with Device key.
Known Issues and Limitations
Encrypted Bitstream loading support
It is capable of loading only .bin format files into the PL; other file formats are not supported.
It does not support the combination of both authenticated and encrypted Bitstream loading.

Zynq UltraScale MPSoC Software Acceleration TRD 2018.1

...
Updated all projects, IPs, and tools versions to 2018.1
2 Introduction
...
how to set up the hardware
...
evaluation kit (board revision 1.0).
3 About the TRD
The Software Acceleration TRD is an embedded signal processing application designed to showcase various features and capabilities of the Zynq UltraScale+ MPSoC ZU9EG device for the embedded domain. The TRD consists of two elements: the Zynq UltraScale+ MPSoC Processing System (PS) and a signal processing application implemented in Programmable Logic (PL). The MPSoC allows you to implement a signal processing algorithm that performs a Fast Fourier Transform (FFT) on samples (coming from a Test Pattern Generator (TPG) in the Application Processing Unit (APU) or from System Monitoring (SYSMON) through an external channel) either as a software program running on the Zynq UltraScale+ MPSoC PS or as a hardware accelerator inside the PL.
The design has three accelerator cores generated using SDx for computing 4096-, 16384-, and 65536-point FFTs. The data transfers of the SDx accelerators are controlled by the APU. There is one accelerator (FFT IP from the Vivado IP catalog) for the 4096-point FFT controlled by the Real-Time Processing Unit (RPU). The TRD demonstrates how to seamlessly switch between a software and a hardware implementation and to evaluate the cost and benefit of each implementation. The TRD also demonstrates the value of offloading computation-intensive tasks onto the PL, thereby freeing CPU resources for user-specific applications.
...
apu
Contains the software source files
petalinux_bsp
Contains the PetaLinux project's configuration
swaccel_app
Contains the application project
zcu102_swaccel_trd
SDx folder containing the hardware platform, pfm files and FFT accelerator C sources.
...
IMPORTANT_NOTICE_CONCERNING_THIRD_PARTY-CONTENT.txt
Contains information about the third party licences
Prerequisites
ZCU102 Evaluation Kit (Rev 1.0 with production silicon)
A Linux development PC with following tools installed:
Xilinx Vivado Design Suite 2018.1
Xilinx SDx 2018.1
PetaLinux 2018.1
Distributed version control system Git installed. For information, refer to the Xilinx Git wiki.
GNU make utility version 3.81 or higher.
...
Before running the demo:
Format the SD-MMC card as FAT32 using an SD-MMC card reader. Copy the contents of $TRD_HOME/sdcard onto the primary partition of the SD-MMC card.
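On a Linux PC this can be done as sketched below. This is an illustration only: /dev/sdX1 is a placeholder for your card reader's partition, and mkfs is destructive, so verify the device with lsblk before running.

```shell
# Identify the card, format the first partition as FAT32, copy the TRD files.
lsblk
sudo mkfs.vfat -F 32 -n SDCARD /dev/sdX1
sudo mount /dev/sdX1 /mnt
sudo cp -r $TRD_HOME/sdcard/* /mnt/
sudo umount /mnt
```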
...
login details are:
user: root
password: root
7 Hardware Setup Requirements
7.1 Requirements for the TRD demo setup:
...
Evaluation Kit (board revision 1.0)
AC power adapter (12 VDC)
Optional: A USB Type-A to USB Micro-B cable (for UART communication) and a Tera Term Pro (or similar) UART terminal program.
...
11. Click Setup -> Serial Port and make sure it is set up as shown in the figure below.
{teraterm_4.png}
12. The serial terminal displays:
{teraterm_3.png}
...
see the PetaLinux login prompt,
{teraterm_5.png}
9 Run QT GUI application
...
{IMG_20160804_143712.jpg}
9.2.5 FFT Window
You can apply a window function:
None (Default, No windowing)
...
$ cd $TRD_HOME/rpu/swaccel_r5_firmware
$ xsdk -workspace . &
...
displayed as shown:
{Soft_Acc_17p4_rpu1.png}
Click 'Import Project' from the welcome screen, browse to the current working directory and make sure the r5FFT, r5FFT_bsp and zcu102_fft_wrapper_hw_platform_0 projects are selected. Click Finish.
...
$ mkdir -p $TRD_HOME/images
$ cp r5FFT/Debug/r5FFT.elf $TRD_HOME/images
10.2 PetaLinux BSP
This
...
using the PetaLinux build tool.
$ cd $TRD_HOME/apu/petalinux_bsp
$ petalinux-config --oldconfig

NOTE:
