EN3062 coursework (2019-2020)
Ze Ji
November 22, 2019
This coursework is worth 20% of the total marks. There are 4 questions
in this coursework. Q1/Q2 are worth 50% and Q3/Q4 are worth 50% of the
whole coursework.
• Deadline: 12pm Monday Week 11
• Coursework should be submitted via Learning Central electronically.
• A written report (in pdf or word) and the corresponding Matlab files are required.
• Matlab files should be submitted separately as a zip file. They should be placed in one folder, which is to be compressed into a single zip file named with your student number, such as xxxxxxx.zip, where xxxxxxx is your student number.
• Q3 and Q4 should be completed using the function templates given. Q1 and Q2 can be completed as two Matlab scripts without using a function template.
• DO NOT COMPRESS THE REPORT TOGETHER WITH THE MATLAB FILES, as the report needs to be marked online.
• For questions 3 and 4, in the Matlab files, complete all sections, including your name in the comment section. For example:
% student_name: First_Name Other_Names Family_Name
% student_number: xxxxxxx
function [ret] = fiducialprojection(X, Y, Z, focallength)
% X, Y, Z: object coordinate in the world frame
% focallength: camera focal length
% ret: the projected coordinate of [X, Y, Z] on the image plane
ret = [0, 0]; % initial value [0, 0]
end
1 Question 1
The Denavit-Hartenberg parameters of a robot are given below:
link    θ     d     a     α
 1      q1    L1    0     90
 2      q2    0     L2    0
 3      q3    0     L3    0
 4      q4    0     0     90
 5      q5    L5    0     0
where L1 = 10.25, L2 = 9, L3 = 9, and L5 = 6.25.
• Sketch the robot kinematics and derive the forward kinematics model in the report. Calculate the result using standard Matlab matrix operators, without the toolbox functions, using the following configuration (angles in degrees):
q1 = 30
q2 = 45
q3 = −90
q4 = 45
q5 = 60
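As a sketch of the first part, the forward kinematics can be computed with plain matrix products. Everything below except the table values is an assumption: the helper `A`, the use of the standard D-H convention, and degrees for all angles.

```matlab
L1 = 10.25; L2 = 9; L3 = 9; L5 = 6.25;    % link lengths from the table

% Standard D-H link transform; theta and alpha in degrees
A = @(theta, d, a, alpha) ...
    [cosd(theta) -sind(theta)*cosd(alpha)  sind(theta)*sind(alpha) a*cosd(theta);
     sind(theta)  cosd(theta)*cosd(alpha) -cosd(theta)*sind(alpha) a*sind(theta);
     0            sind(alpha)              cosd(alpha)             d;
     0            0                        0                       1];

% Chain the five link transforms for the given configuration
T = A(30, L1, 0, 90) * A(45, 0, L2, 0) * A(-90, 0, L3, 0) * ...
    A(45, 0, 0, 90) * A(60, L5, 0, 0);
disp(T)    % 4x4 homogeneous transform of the end-effector
```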
• Use the Matlab functions “Revolute” and “Prismatic” (or “Link”) to construct all five links with the parameters given in the table, and then create a robot kinematics model using “SerialLink”. Experiment with the “teach” method for the robot above.
• Compute the end-effector pose using “fkine” with the same configuration above.
• Now compute the inverse kinematics using “ikine” with the result above and explain whether you can obtain the result.
• If so, what can you tell from the result?
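A minimal sketch of the toolbox workflow for the steps above, assuming Peter Corke's Robotics Toolbox is on the path; the 'mask' option is one assumed way to make “ikine” usable for a 5-DOF arm:

```matlab
L1 = 10.25; L2 = 9; L3 = 9; L5 = 6.25;
links(1) = Revolute('d', L1, 'a', 0,  'alpha', pi/2);
links(2) = Revolute('d', 0,  'a', L2, 'alpha', 0);
links(3) = Revolute('d', 0,  'a', L3, 'alpha', 0);
links(4) = Revolute('d', 0,  'a', 0,  'alpha', pi/2);
links(5) = Revolute('d', L5, 'a', 0,  'alpha', 0);
robot = SerialLink(links, 'name', 'EN3062');

robot.teach();                                % interactive joint sliders
q  = deg2rad([30 45 -90 45 60]);
T  = robot.fkine(q)                           % end-effector pose
qi = robot.ikine(T, 'mask', [1 1 1 1 1 0])    % 5 DOF: mask out one task DOF
```

Comparing qi with q (after rad2deg) is one way to discuss whether ikine recovers the original configuration.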
2 Question 2
Use the built-in robot model, “Puma560”, for the same task above.
>> mdl_puma560 % load the robot model
>> p560 % print out the D-H table
• Display the kinematics model using the Matlab toolbox.
  – Note: It is worth practicing sketching the robot model using the D-H parameters and verifying your drawing against the Matlab model.
• Compute the end-effector pose using “fkine” with the following configuration:
q1 = 30
q2 = 45
q3 = −30
q4 = 45
q5 = 90
q6 = −45
• Now compute the inverse kinematics using “ikine” with the result above and explain whether you can obtain the result.
• If so, what can you tell from the result?
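The same workflow for the Puma 560 might look like the sketch below, again assuming the Robotics Toolbox (qz is the zero configuration defined by mdl_puma560):

```matlab
mdl_puma560                        % load the model; defines p560 and qz
p560                               % print the D-H table
p560.plot(qz)                      % display the kinematics model

q  = deg2rad([30 45 -30 45 90 -45]);
T  = p560.fkine(q)                 % forward kinematics
qi = p560.ikine(T)                 % inverse kinematics (6 DOF, no mask)
rad2deg(qi)                        % compare against the original angles
```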
3 Question 3
Figure 1 shows a typical visual servoing configuration for industrial robots. It is known as the eye-in-hand configuration, in which a camera is mounted at the end-effector of the robot. It is usual practice to use a checkerboard for camera calibration.
This coursework is a simplified case of this problem. Instead of a checkerboard, we use a fiducial pattern in this case. The relative pose between the workbench and the camera needs to be calibrated first using the fiducial pattern at a known position.
In this coursework, we assume the initial position for the robot camera is the origin (0, 0, 0). The positive z-axis is the camera’s optical axis, which intersects the workbench surface at position D (as depicted in Figure 3). The focal length of the camera is 7 mm.
The workbench surface is parallel to the xy-plane and perpendicular to the z-axis. The distance from the surface to the camera is 500 mm.
Figure 1: Visual servo robot (eye-in-hand configuration)
Figure 2: The fiducial image
Figure 3: The fiducial image
3.1 Task:
• Write a Matlab script that can compute the image-plane coordinate of a point on the workbench surface. The function template is provided. Complete the Matlab template fiducialprojection.m.
• By hand, work out the expected coordinates of the vertices (A, B, C, D, E, F, G) projected into the image-plane coordinate space. Tip: First construct the camera matrix and the coordinates of the corresponding vertices in 3D space with respect to the camera frame.
• Verify the hand-calculated results against the results obtained from the Matlab function.
• Similar to the questions above, assume the camera is moved along the x-axis by a distance of 30 mm and rotated about the y-axis by an angle of 0.3 radians. Complete the Matlab script fiducialprojection2.m to compute the image-plane coordinates of the vertices on the workbench surface.
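One assumed reading of the two templates is a simple pinhole projection, with the moved camera handled by expressing the point in the new camera frame first. The function bodies below are illustrative sketches, not the official solutions; units are mm and the image-plane origin is taken to lie on the optical axis.

```matlab
function [ret] = fiducialprojection(X, Y, Z, focallength)
% Pinhole projection of a point given in the camera frame
ret = [focallength * X / Z, focallength * Y / Z];
end

function [ret] = fiducialprojection2(X, Y, Z, focallength)
% Camera moved 30 mm along x and rotated 0.3 rad about y (assumed pose)
t  = [30; 0; 0];
th = 0.3;
Ry = [cos(th) 0 sin(th); 0 1 0; -sin(th) 0 cos(th)];
p  = Ry' * ([X; Y; Z] - t);        % point in the moved camera frame
ret = [focallength * p(1) / p(3), focallength * p(2) / p(3)];
end
```

For example, point D at (0, 0, 500) projects to [0, 0] with the first function.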
4 Question 4
Figure 4 (the original file, named q2.jpg, can be downloaded from Learning Central) shows an image captured by the camera.
Figure 4: The real image taken by a camera
4.1 Task:
• Describe the steps to locate the centres of the two black regions R1 and R2.
• Complete the Matlab function blobcentres.m to implement the above method. The parameter of the function is the filename of the image file (e.g. q2.jpg) as a string.
• The Matlab function should return the two centre coordinates in the image plane in the format of a 2 × 2 matrix, where each row is a coordinate vector, such as [x1, y1; x2, y2].
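One possible implementation of blobcentres.m along the lines described above, assuming the Image Processing Toolbox; the default binarization threshold and the 50-pixel noise-removal size are assumptions:

```matlab
function [ret] = blobcentres(filename)
% filename: image file name (e.g. 'q2.jpg')
% ret: 2x2 matrix [x1, y1; x2, y2] of the two blob centres
img = imread(filename);
if size(img, 3) == 3
    img = rgb2gray(img);                   % work on a grayscale image
end
bw = ~imbinarize(img);                     % black regions become foreground
bw = bwareaopen(bw, 50);                   % remove small noise blobs
stats = regionprops(bw, 'Centroid', 'Area');
[~, idx] = sort([stats.Area], 'descend');  % order regions by size
ret = [stats(idx(1)).Centroid; stats(idx(2)).Centroid];
end
```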