{"id":117122,"date":"2024-10-07T15:00:25","date_gmt":"2024-10-07T07:00:25","guid":{"rendered":"https:\/\/www.tm-robot.com\/?post_type=docs&p=117122"},"modified":"2024-10-07T18:35:13","modified_gmt":"2024-10-07T10:35:13","slug":"optimize-your-object-handling-with-upward-looking-camera","status":"publish","type":"docs","link":"https:\/\/www.tm-robot.com.cn\/ko\/docs\/optimize-your-object-handling-with-upward-looking-camera\/","title":{"rendered":"Optimize Your Object Handling with Upward-Looking Camera"},"content":{"rendered":"
Examples are valid for:
\nTMflow Software version:\u00a0 All.
\nTM Robot Hardware version: All.
\nOther specific requirements:<\/p>\n
Note that older or newer software versions may have different results.<\/p>\n
In common industrial processes, robots typically perform pick and place operations, starting from a picking position and moving to a placing position. This method might require precise positioning when teaching the target points, as well as frequent recalibration, which can be time-consuming, especially when dealing with tiny objects or when the target point is difficult to reach.<\/p>\n
But have you ever wondered whether you could achieve the same results by reversing the sequence \u2014 setting up and placing the workpiece first, then picking it up for vision inspection and correcting any positional discrepancies based on the visual data before final positioning?<\/p>\n
Pick-and-place operations at specific locations are best achieved using an external upward-looking camera with position compensation and tool shift. This method reduces the need for frequent teaching, saving time by allowing the tool to adjust its position accurately at the target point without additional manual intervention.<\/p>\n In traditional pick-and-place systems, achieving precision when the target position is in a difficult or obstructed location can be challenging. For example, when a robot arm needs to place an object inside a small cavity, onto an angled surface, or into a confined space where direct line-of-sight is limited, traditional methods may struggle with accuracy. These hard-to-reach areas may also require specific orientations for the object to fit properly. Using an upward-looking camera with tool shift functionality can overcome these challenges by providing real-time positional data, allowing for precise adjustments even when the target is not easily visible. This eliminates the need for multiple teaching sessions and enhances accuracy in complex environments.<\/p>\n When dealing with tiny objects, such as micro-components in electronics manufacturing or small mechanical parts, the margin for error is extremely small. Traditional pick-and-place methods that rely on pre-recorded points may not achieve the high level of precision required, especially if the object must be placed within a tight tolerance range (e.g., less than 0.1 mm). In these scenarios, an upward-looking camera combined with tool shift can improve precision by detecting and correcting position errors after picking up the object, ensuring the object is perfectly aligned before placement.<\/p>\n In environments where small batches of diverse products are common, such as custom manufacturing, manual teaching for each product can consume significant time. 
Traditional pick-and-place systems might require individual setup for each object type, but by utilizing an upward-looking camera with tool shift, you can reduce setup time significantly. This method enables a single teaching session to adapt to multiple objects, streamlining production for small-batch, high-variety environments. The ability to rapidly shift between different products minimizes downtime and maximizes efficiency.<\/p>\n Human error is often a major factor in inconsistent quality during manual pick-and-place tasks, particularly when precision is required. Manual labor causes variability in positioning, which can result in misaligned or defective placements. By automating the process with an upward-looking camera and tool shift, this method ensures consistent accuracy and precision, reducing the risk of defects caused by misalignment. The system can self-correct and adjust on the fly, offering a stable and repeatable process that enhances product quality, especially for tasks that demand high precision.<\/p>\n Real Case Example:<\/strong> In semiconductor manufacturing, placing tiny chips on a board with high accuracy is critical. The upward-looking camera enables the robot to adjust for micro-level misalignments and achieve placement with minimal error.<\/p>\n Note: Make sure a workspace has been created for Upward Looking \u2013 Position Compensation before starting this project.<\/strong><\/span><\/p>\n 2.\u00a0Bring the object with the tool and create an upward-looking Vision Job, referred to as \u201cUPL<\/span>\u201d in this example.<\/p>\n Then, [save as]<\/span> another Vision Job based on this one, called \u2018UPL1<\/span>\u2019 in this example.<\/p>\n (Note that both vision jobs should be created from the same vision job to ensure alignment of features and reference points, so the compensated points remain accurate after the tool shift.)<\/p>\n 3. 
Copy the target position point from Step 1 and overwrite the tool<\/u><\/span> to use the TCP generated by the first vision job \u2018UPL\u2019, named V1<\/span> in this example.<\/p>\n 4. Move your object to the initial position (picking point), then drag your robot to the picking point and record the point.<\/p>\n 5. Drag a vision node and select \u2018UPL1<\/span>\u2019 as the Vision Job in this example.<\/p>\n 6. Copy<\/span> the target point created in Step 3<\/span>. Then click the pencil icon to edit the point in TMflow, select [TOOL SHIFT<\/strong> to vision TCP_UPL1<\/span> in this example], and choose [KEEP PATH<\/strong><\/span>].<\/p>\n
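The tool-shift compensation in Step 6 can be understood as simple planar pose arithmetic: the upward-looking camera measures the pose of the gripped object relative to the TCP, and the placement point is shifted so that the object, rather than the nominal TCP, lands on the taught target. A minimal sketch of that math in Python (illustrative only \u2014 the function names are hypothetical, and this is not the TMflow API; TMflow performs an equivalent correction internally when the vision TCP and tool shift are selected):

```python
import math

def compose(a, b):
    """Compose two planar poses a then b, each given as (x, y, theta)
    in mm and radians: the pose of frame b expressed through frame a."""
    ax, ay, at = a
    bx, by, bt = b
    return (ax + bx * math.cos(at) - by * math.sin(at),
            ay + bx * math.sin(at) + by * math.cos(at),
            at + bt)

def invert(p):
    """Inverse of a planar pose, so compose(p, invert(p)) is identity."""
    x, y, t = p
    return (-x * math.cos(t) - y * math.sin(t),
             x * math.sin(t) - y * math.cos(t),
            -t)

def compensated_tcp_pose(target_pose, grip_offset):
    """Pose the TCP must reach so the *object* lands on target_pose.

    grip_offset is the camera-measured pose of the object in the TCP
    frame (how the part actually sits in the gripper after picking).
    Since object = compose(tcp, grip_offset), solving for tcp gives
    tcp = compose(target_pose, invert(grip_offset)).
    """
    return compose(target_pose, invert(grip_offset))
```

With a perfect grip (`grip_offset` of all zeros) the compensated pose equals the taught target; any measured offset simply shifts the commanded TCP pose in the opposite direction, which is why the taught target point itself never needs re-teaching.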
<\/a><\/p>\nOptimal Scenario for Applying This Method<\/h1>\n
<\/a><\/p>\n1. Difficult Target Locations<\/h2>\n
2. Tiny Objects or Small Tolerance<\/h2>\n
3. Fast Speed: Small Batch, High Variety Production<\/h2>\n
4. Quality: Reducing Human Error in Precision Tasks<\/h2>\n
Process Workflow<\/h1>\n
<\/a><\/p>\n
<\/a><\/p>\nTEACH THE TARGET POSITION:<\/span><\/h2>\n
\n
<\/a><\/p>\n
<\/a><\/p>\n
<\/a><\/p>\nTEACH THE INITIAL POSITION:<\/span><\/h2>\n
<\/a><\/p>\n
<\/a><\/p>\n
<\/p>\n