
## 1. Instructions

Most instructions are the same as before; here we describe only the points that differ.

- Generate a zip package and upload it to Canvas. Also upload the pdf with the name {SFUID}.pdf. The package must contain the following in the following layout:
  - {SFUID}
    - matlab
      - m
      - m
      - m (Section 3.1.1)
      - m (Section 3.1.2)
      - m
      - m (Section 3.1.3)
      - m (Section 3.2.3)
      - m (Section 3.2.2)
      - m
      - m
      - m (Section 3.2.1)
      - m
      - m
      - m
      - m (Section 3.1.5)
      - m (Section 3.1.4)
      - m
      - Any other helper functions you need
    - ec
      - m
      - m
      - m
      - m
    - result (You likely won't need this directory. If you have any results which you cannot include in the write-up but want to refer to, please use it.)

- File paths: Make sure that any file paths you use are relative, not absolute, so that we can easily run your code on our end. For instance, do not write `imread('/some/absolute/path/data/abc.jpg')`; write `imread('../data/abc.jpg')` instead.
- Project 3 is worth 22 pts.

## 2. Overview

One of the major areas of computer vision is 3D reconstruction. Given several 2D images of an environment, can we recover the 3D structure of the environment, as well as the position of the camera/robot? This has many uses in robotics and autonomous systems, as understanding the 3D structure of the environment is crucial to navigation. You don’t want your robot constantly bumping into walls, or running over human beings!

In this assignment, there are two programming parts: sparse reconstruction and dense reconstruction. Sparse reconstructions generally contain only a small number of points, but still manage to describe the objects in question. Dense reconstructions are detailed and fine-grained. In fields like 3D modelling and graphics, extremely accurate dense reconstructions are invaluable when generating 3D models of real world objects and scenes.

In part 1, you will write a set of functions to generate a sparse point cloud for some test images we have provided. The test images are two renderings of a temple from different angles. We have also provided a mat file containing good point correspondences between the two images. You will first write a function that computes the fundamental matrix between the two images. Then you will write a function that uses the epipolar constraint to find more point matches between the two images. Finally, you will write a function that triangulates a 3D point for each pair of 2D point correspondences.
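As a preview of that last step, triangulation is commonly done with the linear (DLT) method: each correspondence, together with the two 3x4 camera projection matrices, yields a small homogeneous system whose least-squares solution is the 3D point. The assignment itself is in MATLAB; the following is only a minimal NumPy sketch for illustration, and the function and variable names are my own, not the required interface:

```python
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one correspondence.

    P1, P2: 3x4 camera projection matrices.
    x1, x2: (x, y) image coordinates of the point in each image.
    Returns the 3D point in inhomogeneous coordinates.
    """
    # Each image contributes two rows: x * (row 3 of P) - (row 1 of P), etc.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The homogeneous solution is the right singular vector with the
    # smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize
```

With exact correspondences the SVD null vector recovers the point exactly; with noisy points this minimizes an algebraic error, which a non-linear refinement can improve afterwards.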

We have provided a few helpful mat files. someCorresps.mat contains good point correspondences. You will use this to compute the Fundamental matrix. Intrinsics.mat contains the intrinsic camera matrices, which you will need to compute the full camera projection matrices. Finally, templeCoords.mat contains some points on the first image that should be easy to localize in the second image.

In Part 2, we use the extrinsic parameters computed in Part 1 to build a dense 3D reconstruction of the temple. You will need to compute the rectification parameters. We have provided testRectify.m (and some helper functions) that will use your rectification function to warp the stereo pair. You will then use the warped pair to compute a disparity map and, finally, a dense depth map.
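Rectification is what makes the last step simple: for a rectified pair with focal length f (in pixels) and baseline b, depth = f·b / disparity. A minimal NumPy sketch of that conversion, for illustration only (the names are hypothetical, not the required MATLAB interface; the assignment's own function signatures are given in the task descriptions):

```python
import numpy as np

def depth_from_disparity(disparity, f, b):
    """Convert a disparity map to a depth map for a rectified stereo pair.

    disparity: 2D array of per-pixel disparities.
    f: focal length in pixels; b: baseline between the two cameras.
    Pixels with zero (or invalid) disparity are assigned depth 0.
    """
    depth = np.zeros_like(disparity, dtype=float)
    valid = disparity > 0
    depth[valid] = f * b / disparity[valid]  # depth = f * b / d
    return depth
```

Guarding the zero-disparity pixels matters in practice: untextured or occluded regions often get disparity 0, and dividing there would produce infinities in the depth map.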

In both cases, multiple images are required, because without two images that share a large overlapping region, the problem is mathematically underspecified. It is for this same reason that biologists suppose humans, and other predatory animals such as eagles and dogs, have two front-facing eyes: hunters need to discern depth when chasing their prey. Herbivores such as deer and squirrels, on the other hand, have their eyes positioned on the sides of their heads, sacrificing most of their depth perception for a larger field of view. The whole problem of 3D reconstruction is inspired by the fact that humans and many other animals rely on depth perception when navigating and interacting with their environment. Giving autonomous systems this information is very useful.

## 3. Tasks

### 3.1 Sparse reconstruction

In this section, you will write a set of functions to compute the sparse reconstruction from two sample images of a temple. You will first estimate the Fundamental matrix, compute point correspondences, then plot the results in 3D.

It may be helpful to read through Section 3.1.5 right now. In Section 3.1.5 we ask you to write a testing script that runs your whole pipeline. It will be easier to start that script now and add to it as you complete each question.

### 3.1.1 Implement the eight point algorithm (2 pts)

You will use the eight point algorithm to estimate the fundamental matrix. Please use the point correspondences provided in someCorresp.mat. Write a function with the following form:

```matlab
function F = eightpoint(pts1, pts2, M)
```

pts1 and pts2 are Nx2 matrices corresponding to the (x, y) coordinates of the N points in the first and second image, respectively. M is a scale parameter.

- Normalize points and un-normalize F: scale the data by dividing each coordinate by M (the maximum of the image's width and height). After computing F, you will have to "unscale" the fundamental matrix. Better results can be obtained by subtracting the mean coordinate and then dividing by the standard deviation, but for this assignment a simple scaling (without mean subtraction) is sufficient.
- You must enforce the rank-2 constraint on F before unscaling. Recall that a valid fundamental matrix F has all epipolar lines intersecting at a single point (the epipole), meaning that F has a non-trivial null space. In general, with real points, the eight-point solution for F will not satisfy this condition exactly. To enforce the rank-2 condition, decompose F with SVD to get the three matrices U, Σ, V such that F = UΣV^T. Then force the matrix to be rank 2 by setting the smallest singular value in Σ to zero, giving you a new Σ′. Now compute the proper fundamental matrix with F′ = UΣ′V^T.
- You may find it helpful to refine the solution by using local minimization. This probably won't fix a completely broken solution, but it may make a good solution better by locally minimizing a geometric cost function. For this we have provided refineF.m (which takes the fundamental matrix and the two sets of points), which you can call from eightpoint before unscaling F. This function uses MATLAB's fminsearch to search non-linearly for a better F that minimizes the cost function. For this to work, it needs an initial guess for F that is already close to the minimum.
- Remember that the x-coordinate of a point in the image is its column entry and the y-coordinate is its row entry. Also note that "eight-point" is just a figurative name: it means you need at least 8 points; your algorithm should use an overdetermined system (N > 8 points).
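The steps above (simple scaling by M, linear solve, rank-2 projection, unscaling, with the optional refineF step omitted) can be sketched as follows. This is an illustrative NumPy version for reference only, not the required MATLAB function:

```python
import numpy as np

def eightpoint(pts1, pts2, M):
    """Estimate F from N >= 8 correspondences (pts1, pts2: Nx2 arrays),
    using simple scaling by M (no mean subtraction)."""
    p1 = pts1 / M
    p2 = pts2 / M
    x1, y1 = p1[:, 0], p1[:, 1]
    x2, y2 = p2[:, 0], p2[:, 1]
    # Each correspondence gives one row of the constraint matrix A f = 0,
    # from x2^T F x1 = 0 written out in the nine entries of F.
    A = np.column_stack([x2 * x1, x2 * y1, x2,
                         y2 * x1, y2 * y1, y2,
                         x1, y1, np.ones(len(p1))])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    # Enforce rank 2: zero the smallest singular value and recompose.
    U, S, Vt2 = np.linalg.svd(F)
    S[2] = 0.0
    F = U @ np.diag(S) @ Vt2
    # Unscale: scaled points are x_s = T x with T = diag(1/M, 1/M, 1),
    # so the fundamental matrix in pixel coordinates is T^T F T.
    T = np.diag([1.0 / M, 1.0 / M, 1.0])
    F = T.T @ F @ T
    return F / np.linalg.norm(F)  # fix the arbitrary overall scale
```

Normalizing by the Frobenius norm at the end just fixes the arbitrary scale of F; any non-zero scaling of F describes the same epipolar geometry.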

To test your estimated F, use the provided function displayEpipolarF.m (takes in F and the two images). This GUI lets you select a point in one of the images and visualize the corresponding epipolar line in the other image like in the figure below.