Channel: Machine Vision topics

Missing Features on NI Vision 2013?


Hi guys, 

I can't seem to figure out why several features are either missing or disabled in my version of NI Vision Builder 2013.

 

I can't complete the Check for Cap tutorial because the Reposition Region of Interest checkbox is unavailable.

image.png

 

I have also noticed that some features appear to be totally missing in other places.

The tutorial I'm attempting is on page 28 of this PDF:

http://www.ni.com/pdf/manuals/373379h.pdf

 

Any ideas?


Interface name that points to remote device?


I have a frame grabber (PXIe-1435) installed in a PXIe-1082 chassis, with a Photonfocus Camera Link camera attached. I am using a laptop with LabVIEW 2015 to develop and deploy code to the PXIe over Ethernet.

 

 In MAX, I can configure the camera and start pulling frames to my laptop.

 

grab working in MAX.png

 

I can deploy the following VI to the PXIe, run it, and successfully grab frames; the image is displayed on my laptop.

 

deployableVI.png

 

What I want to do, however, is run this code on my laptop and use a reference name that points to the correct PXIe slot and port. In this case it might be something like "//192.168.1.12::img0::0".

 

Is there a way to do this? It seems like MAX is doing just that, since it is not deploying any code to the PXIe...

 

The reason I ask is that the manufacturer provides a LabVIEW toolbox for configuring camera parameters over the Camera Link serial bus, but it uses ActiveX. So I can't deploy their code to the PXIe, but if I could run their code on my laptop and just point it at the PXIe, maybe I could configure camera parameters over the network...

 

 

Using Basler Camera For Controlling Stepper Motor


Hi Everyone,

I need to know whether I can use a Basler camera to control a stepper motor without an external acquisition board.
In other words, does a Basler camera contain a microcontroller that can issue motor commands?

Thanks!

Separating relatively dark slabs from non-uniform background


Hello,

I actually asked this in another thread, but I think it's a new subject. I apologize in advance if it's a double post.

https://forums.ni.com/t5/Machine-Vision/Marble-slab/td-p/3708856/page/4

 

I have marble slabs, and I image them with a line-scan camera. In the resulting image the background is extremely dark, but not completely dark (it's non-uniform). If you look carefully, you can see some parts of the background that are not totally black (see the contrast-enhanced image_01). Most marble slabs have some dark areas, too.

I want to find the contour of these marble slabs. To do that, I first need to separate the marble from the background. I use manual thresholding (thresholding bright objects, see the attached script), because I can't find an auto-thresholding method that works here. Manual thresholding doesn't always work for me either; sometimes I have to change the lower threshold value, which I'd like to avoid.

 

Is there a better way to do this? I think I might use some gray morphology before thresholding. Any ideas?
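
For reference, a minimal sketch of an automatic alternative in Python with OpenCV (a stand-in for the NI Vision steps, not the attached script): Otsu's method picks the threshold from the histogram itself, morphology cleans up the mask, and the slab outline is taken as the largest external contour. The file name and kernel sizes are assumptions.

import cv2

# Hypothetical input file; replace with the line-scan image of the slab.
img = cv2.imread("marble_slab.png", cv2.IMREAD_GRAYSCALE)

# Smooth slightly so single-pixel noise does not bias the threshold.
blur = cv2.GaussianBlur(img, (5, 5), 0)

# Otsu's method chooses the threshold automatically from the histogram,
# so no manual lower value needs to be tuned per image.
_, mask = cv2.threshold(blur, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Closing fills dark areas inside the slab; opening removes bright
# specks left over from the non-uniform background.
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))
mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

# The slab is assumed to be the largest bright region; its external
# contour is the outline being sought.
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
slab_contour = max(contours, key=cv2.contourArea)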

 

 

VBai error: Invalid Inspection File


Hello,

 

 

I was doing some testing of our system today and saved this file from another one. Everything during the day was hunky-dory, but when I went to open the file at home, this error kept appearing. I'm not sure why I cannot open it, and I don't recall making any major changes beyond inspection parameter tweaks.

Help?


Cheers!

svm in machine vision


Hi

 

Can anyone provide me with a basic example VI that does classification using the SVM functions found in Machine Vision under Vision and Motion? It would be a great help.

 

Yamada
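
For reference, a minimal sketch of the same SVM classification workflow in Python with scikit-learn rather than the NI Vision classification VIs; the feature files and their contents are assumptions (e.g. one feature vector per image, exported from a Vision processing step).

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Hypothetical feature matrix (one row per image) and label vector.
X = np.load("features.npy")
y = np.load("labels.npy")

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# RBF-kernel support vector classifier; C and gamma usually need tuning.
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X_train, y_train)

print("test accuracy:", clf.score(X_test, y_test))
print("prediction for first test sample:", clf.predict(X_test[:1]))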

License Manager does not give option to activate my VDM runtime


We have an application deployed on a customer machine. They bought a Vision runtime license for it, as required.

 

We are getting error -1074396157 at the first VDM call after acquisition. Back in LabVIEW, I read that this error is caused by not having a licensed NI Vision Runtime.

 

So I go to NI License Manager, and it says that the only licensed software on the machine is Vision Acquisition, and that it's all fine. There's nothing sitting there waiting for a license.

 

I know that Vision Acquisition and NI VDM Runtime are two different things. MAX says that NI-IMAQdx is installed, including Development Support and Runtime. Yet the NI License Manager doesn't give me the option to apply a license. Also, I'm not getting that modal popunder that shows up behind my application on machines where the license hasn't yet been applied, and blocks all input. Rather, the program runs as normal except for always having that error come up on any VDM call.

 

How do I get the NI License Manager to notice that there's an unlicensed VDM runtime we could apply a license to?

 

VBAI Crashing Controller via TCP/IO Command


Hi There,

 

I've constructed a VBAI program to send a packet (containing a position vector) to a controller, obtain an image at that point, and run an inspection. Unfortunately, sending this packet crashes the controller (as evidenced by a separate machine running commands to the controller being disconnected and stuck in an indefinite loop of "Connecting...").

I've confirmed that my computer is both connected to and communicating with the controller by using "Packet Sender" (a nifty little application for such tasks). After sending the packet via Packet Sender, I can read the Linux console on the controller and confirm my packet and the positional data therein. Going back to VBAI and sending the exact same packet crashes the controller. I can also ping the controller from my computer.

 

VBAI has the correct port and IP address of the controller. Is there any other step I need to take in order to establish a connection with the device? I'm at my wit's end.

Thanks,

 

 

 

vbai file : https://drive.google.com/drive/folders/1dTXmtWPwsRC1vkbNxgGuFiH5TLxnmDuj?usp=sharing
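
One way to narrow this down is to compare, byte for byte, what VBAI puts on the wire against a known-good send (for example with Wireshark); a missing or extra termination character is a common culprit. Below is a minimal Python sketch of the same send, with a placeholder address, port, and payload, purely for comparison against the Packet Sender transmission:

import socket

# Placeholder values; use the controller's real IP, port, and packet bytes.
CONTROLLER_ADDR = ("192.168.0.50", 5000)
PAYLOAD = b"X100.0,Y250.0,Z10.0\r\n"  # hypothetical position-vector packet

# Open a TCP connection, send the packet once, and close cleanly.
with socket.create_connection(CONTROLLER_ADDR, timeout=2.0) as sock:
    sock.sendall(PAYLOAD)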


Access camera connected to remote target like MAX


So, through MAX, I can expand "Remote Systems", click on my RT target, expand it, look at the devices, click on the camera, see the image, and change the attributes.

 

Is that behavior possible through a LabVIEW application running on the local machine, without altering the RT target's code? For example, can I put a URL into the IMAQdx Camera Session control?

Problem while using Hamamatsu c9100-13 camera with Phoenix D24CL-PCI32 frame grabber


Hi,

I am trying to operate a Hamamatsu EM-CCD camera (C9100-13) through LabVIEW 2015. The camera is connected to a Phoenix Camera Link frame grabber (AS-PHX-B-D24CL-PCI32). I would like to know which version of the Video Capture Library for LabVIEW will allow me to operate this camera, and where I can download it.
This camera is only compatible with DCAM-API driver version 13.10.4418 (https://dcam-api.com/downloads/#archive).
Are there frame grabbers that come with associated LabVIEW drivers? Do you have any recommendations?

How can I attach two images taken from a Basler Racer Line Scan using Vision Builder AI?


On the production line, the image of the profile of a band saw is taken by a Basler Racer camera. When it takes the image using line scan, it scans the whole length of the band saw but splits the result into two images. I would like to know if there is a way to join the two images so that the band saw is continuous, using Vision Builder AI.

 

Originally I used VBAI to look at each separate image and take the profile data one at a time. Then I tried to use LabVIEW to combine the two images by creating two arrays from the images based on pixel location, adding the two arrays to create a single image, and then calling the VI from VBAI to output the image. This works, but sometimes one of the images from the Basler camera does not come through, which causes an error and no output image.
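
For reference, a minimal sketch of that combine step in Python with OpenCV/NumPy rather than the LabVIEW VI, with an explicit check for the failure case where one image never arrives; the file names and the vertical scan direction are assumptions.

import cv2
import numpy as np

# Hypothetical file names for the two halves delivered by the line-scan camera.
top = cv2.imread("band_saw_part1.png", cv2.IMREAD_GRAYSCALE)
bottom = cv2.imread("band_saw_part2.png", cv2.IMREAD_GRAYSCALE)

# Guard against the failure mode described above: one image not coming through.
if top is None or bottom is None:
    raise RuntimeError("one of the two line-scan images is missing; skip or retry")

# Pad the narrower image so both halves have the same width before stacking.
width = max(top.shape[1], bottom.shape[1])
top = cv2.copyMakeBorder(top, 0, 0, 0, width - top.shape[1],
                         cv2.BORDER_CONSTANT, value=0)
bottom = cv2.copyMakeBorder(bottom, 0, 0, 0, width - bottom.shape[1],
                            cv2.BORDER_CONSTANT, value=0)

# Stack along the scan direction (assumed vertical here) into one image.
full = np.vstack((top, bottom))
cv2.imwrite("band_saw_full.png", full)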

 

If there is a simpler way to combine the two images using VBAI, or even LabVIEW, I would greatly appreciate your input.

 

Thanks!

Installing a Basler acA2500 camera


How do I install and configure my Basler acA2500 camera with the LabVIEW installation on my computer?

IMAQ triggering to measure pulse light


Hello everyone,

First of all, thanks for all the help I have gotten from this forum by reading here 'anonymously'. Unfortunately, this time I have a problem I have not been able to solve by myself. In the program I am using a state machine, but I changed it to what you see in the snippet to make it easier to read and understand.

 

I have a GL2018R camera from Sensors Unlimited, an NI PCIe-1433 frame grabber, and an NI PCIe-6353 DAQ card. The camera has a 1D single-line sensor and can acquire up to 147,000 lines per second (147 klps). The light source generates 1 ns pulses at a rate of 10 kHz (adjustable up to 20 kHz). I basically just want to trigger the camera to record one image/spectrum per trigger. At the moment I am able to trigger, record, and read out the spectra, but some strange things are happening:

For example, the spectral intensity depends on the trigger settings (timing, exposure time, width, etc.), which should not happen, because the light is only there for 1 ns. It is practically impossible to integrate more or less of this pulse and thereby change its intensity. The spectrum should either be there (when the triggering works) or not (when triggered after the light is gone).

 

Recording the signal:

First I recorded and read out the spectra continuously, but the readout sometimes could not keep up (losing frames), so I am now recording a finite number of spectra, where each spectrum is written to a new buffer number (within a ring buffer). Afterwards the buffer is read out, and a new acquisition is started when desired, as you can see in the attached snippet.

 

Triggering the acquisition:

A continuous output from the DAQ board triggers the light source at 10 kHz, using ctr0. On a second channel, a second trigger signal is generated with the same frequency and set to finite output, using ctr1. The two signals are started by ctr3 of the board to make sure they are synchronous. The width and delay can be chosen independently. I checked the output on an oscilloscope, and it works just fine, as expected.
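
For reference, a rough sketch of that counter setup using the nidaqmx Python package rather than the LabVIEW snippet; the device, counter, and terminal names are placeholders, and only the DAQ side is shown.

import nidaqmx
from nidaqmx.constants import AcquisitionType

N_SPECTRA = 1000  # finite number of camera triggers per acquisition

# ctr0: continuous 10 kHz pulse train driving the light source.
light = nidaqmx.Task()
light.co_channels.add_co_pulse_chan_freq("Dev1/ctr0", freq=10_000.0, duty_cycle=0.5)
light.timing.cfg_implicit_timing(sample_mode=AcquisitionType.CONTINUOUS)
light.triggers.start_trigger.cfg_dig_edge_start_trig("/Dev1/Ctr3InternalOutput")

# ctr1: finite pulse train at the same frequency triggering the camera.
camera = nidaqmx.Task()
camera.co_channels.add_co_pulse_chan_freq("Dev1/ctr1", freq=10_000.0, duty_cycle=0.1)
camera.timing.cfg_implicit_timing(sample_mode=AcquisitionType.FINITE,
                                  samps_per_chan=N_SPECTRA)
camera.triggers.start_trigger.cfg_dig_edge_start_trig("/Dev1/Ctr3InternalOutput")

# ctr3: a single pulse whose edge starts both armed tasks so they stay synchronous.
start = nidaqmx.Task()
start.co_channels.add_co_pulse_chan_freq("Dev1/ctr3", freq=1.0, duty_cycle=0.5)
start.timing.cfg_implicit_timing(sample_mode=AcquisitionType.FINITE, samps_per_chan=1)

light.start()
camera.start()
start.start()  # both output tasks fire when ctr3's edge arrives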

 

Problem:

I do not know what the problem is at the moment. Although the triggering works, meaning I get one spectrum for every incoming pulse (and if I do not provide the trigger, no acquisition takes place, leading to a timeout error), the intensities depend on the trigger timing. The integration time has a big influence too, which does not match my expectations. Also, the first acquired spectrum looks normal, the second is just flat with varying intensity, and after that the intensity alternates: every other spectrum has a slightly lower intensity.

In the end it looks like it is not triggering correctly somehow.

I read somewhere that one has to be careful not to trigger the camera AND the frame grabber. I do not think I am doing that. I made the settings with MAX, and also by sending serial commands while disabling access to the MAX settings, with basically the same result. If the light pulse arrives within the integration time of the camera, there should not be a dependence on the exact trigger timing, the integration time, and so on...?!

 

If I choose a long integration time and adjust the light pulse so it is in the center of the integration window, the spectra all look the same, which is good (except for spectra 1 and 2). Still, I cannot explain the strange behavior mentioned above and thus do not trust the results.

 

Please let me know if you need any additional information.

I would appreciate any help and hints you can give me.

 

Thanks!

Broken line(edge) detection from binary image


Hi,

I'm trying to mark (detect) 4 straight lines in a binary image and determine whether any of them are broken (not straight).
If there are any broken lines, I wish to mark (detect) these lines (edges) with a vertical line (which in this case (Test_Image) is located up in the right-hand corner).
I have created an application (VI) which seems to work well with the "raw" image, but when I extract the color plane (blue) and try to detect the lines in the binary image (not the raw image), I can't seem to find anything. Maybe I'm using the wrong tools (VIs) or simply the wrong parameters (too much noise?).

Any suggestions?

(It works when I use "First Edge Rake" detection, but then I can only detect one horizontal line.)

Example.JPG
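
For reference, a minimal sketch of one non-NI way to find the lines and their breaks, in Python with OpenCV: the probabilistic Hough transform returns straight segments, and a broken line shows up as two or more roughly collinear segments. The file name and parameter values are assumptions to be tuned against Test_Image.

import cv2
import numpy as np

# Hypothetical file: the binary image extracted from the blue plane.
binary = cv2.imread("Test_Image.png", cv2.IMREAD_GRAYSCALE)
_, binary = cv2.threshold(binary, 127, 255, cv2.THRESH_BINARY)

# A small closing bridges single-pixel noise so only real breaks split a line.
binary = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, np.ones((3, 3), np.uint8))

# The probabilistic Hough transform returns straight segments; a broken line
# comes back as several roughly collinear segments whose end points are
# farther apart than maxLineGap.
segments = cv2.HoughLinesP(binary, rho=1, theta=np.pi / 180, threshold=80,
                           minLineLength=40, maxLineGap=5)

if segments is not None:
    for x1, y1, x2, y2 in segments.reshape(-1, 4):
        print(f"segment from ({x1}, {y1}) to ({x2}, {y2})")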

Image acquisition with the FPGA C API and NI-1473R-LX110


Hello NI Community,

I am working on a project that requires image acquisition in a C/C++ application while using the NI-1473R board. Hence I am relying on the FPGA C API generator to access the image data sent to the FIFO.

If I generate the header file from the 10-tap 8-bit example, I see that I have access to the Camera Serial Port and Acquisition controls and indicators. I can also read from the FIFO using the 'Host DMA U8' VI.

My questions are:

1- Do I just use a FIFO method function (NiFpga_ConfigureFifo2) with a certain depth to acquire a single image on the host? That is basically what is done in the host VI in the example. (See the sketch after these questions.)

2- I can't seem to find a good example for this purpose. The FPGA C Interface comes with simple examples, but I would rather have a more in-depth example (related to image processing) to look at.
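
For reference, a rough sketch of the same configure-and-read FIFO workflow using NI's nifpga Python bindings rather than the C API; the bitfile path, resource name, FIFO name, and frame size are placeholders taken from a hypothetical generated project.

from nifpga import Session

FRAME_BYTES = 1024 * 1024  # placeholder: width * height * bytes per pixel

# Opening the session downloads and, by default, runs the bitfile.
with Session(bitfile="Acquisition_1473R.lvbitx", resource="RIO0") as session:
    fifo = session.fifos["Host DMA U8"]   # FIFO name from the generated project
    fifo.configure(4 * FRAME_BYTES)       # requested depth: room for a few frames
    fifo.start()

    # Read one frame's worth of elements; the C equivalent would be
    # NiFpga_ConfigureFifo2 followed by NiFpga_ReadFifoU8 in a loop.
    result = fifo.read(FRAME_BYTES, timeout_ms=5000)
    frame = bytes(result.data)
    print(len(frame), "bytes read,", result.elements_remaining, "elements remaining")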

 

Best,

Mehmet

 

 

Run VBAI in LV. (Error -354703 occurred at VBAI Interface - Open Connection.vi)


 

Hi everyone,

I'm trying to run VBAI 2015 from LabVIEW 2015 (Evaluation). I load the example <NI>VBAI 2011/API examples/LabVIEW. Error -354703 appears when I "Launch Vision Builder AI Engine". I tried to create a simple program similar to the example, with the same error. I attached files for review.

*I'm new to LabVIEW, and my ambition is to build a vision-based inspection machine using LabVIEW.

Thanks,
dane

 

unknown.jpg

 

 

 

 

How can pattern matching succeed when the workpiece's position and angle vary and the background is cluttered?


Hi~

 

I don't know how I can find the pattern using the IMAQ Match Pattern 4 algorithm when the workpiece's position and angle vary and the background is cluttered.
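
For reference, a rough sketch of a rotation-tolerant alternative in Python with OpenCV rather than IMAQ Match Pattern: feature matching plus a RANSAC homography can locate a template despite position, angle, and background clutter. The file names and parameter values are assumptions.

import cv2
import numpy as np

# Hypothetical files: the taught template and the cluttered scene image.
template = cv2.imread("template.png", cv2.IMREAD_GRAYSCALE)
scene = cv2.imread("scene.png", cv2.IMREAD_GRAYSCALE)

# ORB features are rotation-invariant, so angle changes are tolerated.
orb = cv2.ORB_create(nfeatures=1000)
kp_t, des_t = orb.detectAndCompute(template, None)
kp_s, des_s = orb.detectAndCompute(scene, None)

# Brute-force Hamming matching with cross-check, keeping the best matches.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_t, des_s), key=lambda m: m.distance)[:60]

src = np.float32([kp_t[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp_s[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

# RANSAC discards matches that land on background clutter.
H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
n_inliers = int(inliers.sum()) if inliers is not None else 0
print("pattern located" if H is not None else "pattern not found",
      "-", n_inliers, "inlier matches")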

 

I attach the sample images.

 

If anyone knows about this issue, please help us.

Thanks.
