Robotic-assisted surgery (RAS) has recently made inroads into many operating rooms, offering multiple benefits for patients. For the surgeon operating the robot, however, several conditions still hinder the workflow. In the particular case of automated suturing, most published approaches require a user to manually specify the targets toward which the robot moves. This paper presents a system for the automatic generation of needle entry positions for robot-assisted suturing. A U-Net network is trained to segment skin wounds robustly under changes in wound shape, illumination, and hue of the surrounding skin. The system also generates the needle entry and exit points along the wound. Point-cloud analysis of the wound is implemented in conjunction with a control system for a Universal Robots UR3 robot, which marks the needle entry points on a suture training pad. The experiments cover various conditions of edge separation, wound plane, and wound shape. The results show reliable performance, with precision consistent with surgical standards across the different types of wounds analyzed.