This problem is about as straightforward as they come. You know that speed is equal to distance divided by time
#color(blue)(v = d/t)#
which can only mean that time will be
#v = d/t implies t = d/v#
The time is therefore equal to
#t = (1color(red)(cancel(color(black)("m"))))/(5 * 10^5color(red)(cancel(color(black)("m")))/"s") = 1/5 * 10^(-5) = 0.2 * 10^(-5) = color(green)(2 * 10^(-6)"s")#
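As a quick sanity check, the same division can be done numerically; this is just a sketch of the arithmetic above, with the distance and speed taken from the problem statement.

```python
# Time for the electron to cross the meter stick: t = d / v
d = 1.0   # distance in meters
v = 5e5   # electron speed in meters per second
t = d / v
print(t)  # 2e-06, i.e. 2 * 10^(-6) s
```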
The speed of the electron, 500,000 meters per second (about 1 million miles per hour), seems very fast, so one might ask whether relativistic corrections are needed. It turns out, however, that this speed is only about 0.16% of the speed of light, and relativistic corrections are generally not a concern until the speed reaches roughly 10% of the speed of light.
In the frame of reference of the electron, one can calculate the length contraction of the meter stick and show that the electron sees a meter stick that is almost the same length as it is in your frame.
Taking #c# to be #1# and #v# to be #0.0016#, the contracted length is
#L = L_0 * sqrt(1 - v^2/c^2) = sqrt(1 - 0.0016^2)color(white)(l)"m" ~~ (1 - 1.3 * 10^(-6))color(white)(l)"m"#
The apparent length traveled differs by only about one micrometer, which is negligible for most practical purposes. But it is correct to say that, in the electron's frame of reference, it crosses your meter stick slightly faster than you observe.
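The size of the contraction can be sketched numerically using the standard length-contraction formula #L = L_0sqrt(1 - v^2//c^2)#, with #v//c = 0.0016# as above; the variable names here are just illustrative.

```python
import math

beta = 0.0016            # v/c for the electron, as used above
L0 = 1.0                 # proper length of the meter stick, in meters
L = L0 * math.sqrt(1 - beta**2)  # contracted length in the electron's frame
print(L0 - L)            # roughly 1.3e-06 m, about a micrometer
```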