Miller Welding Discussion Forums
  1. #1

    Calibration Old versus New

    Our company has calibrated its welders in house for the past 10 years. We use a Miller load bank with a Miller digimeter shunt. MIG welders are the heart of our operation, about 150 of them. The welders are rated at 300 amps max, but we seldom run them that high, and we change them out every few years. We have had a new model in house for the past couple of years; those units came in on the low side of the standard we have always calibrated to, but still passed. We calibrate to ±1 volt and ±3% on amps between 18 and 23 volts. We clean and calibrate annually, and this last time about 5% of the new welders were out of our specs. I won't state the company's name, but they are now telling us we will have to change our standard. That concerns us. We have always set the load bank at 350 amps, and now they want us to set it at 150 amps to get the correct amp and volt readings on the newer-style welder.

    Our contention is that the load bank, set at 350 amps, is the standard we have used successfully to assure our product quality, without any problems until this new-style welder. We are not willing to compromise our product line. What's your opinion on this situation? Could it be that the new electronics are smarter than the load bank?
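
    For concreteness, the tolerance check described above amounts to the arithmetic below, sketched in Python; the setpoints and readings are made up for illustration, not real calibration data.

    ```python
    # Pass/fail arithmetic for the +/-1 V, +/-3% amp tolerance described above.
    # All numbers here are illustrative, not real calibration data.

    VOLT_TOL = 1.0       # +/- 1 volt
    AMP_TOL_PCT = 0.03   # +/- 3 percent of the expected amp reading

    def in_cal(set_volts, meas_volts, expected_amps, meas_amps):
        """True if both readings fall inside the stated tolerances."""
        volts_ok = abs(meas_volts - set_volts) <= VOLT_TOL
        amps_ok = abs(meas_amps - expected_amps) <= AMP_TOL_PCT * expected_amps
        return volts_ok and amps_ok

    # A machine set to 18 V, expected to draw 130 A on the load bank:
    print(in_cal(18.0, 17.5, 130, 127))  # True  (0.5 V low, ~2.3% low on amps)
    print(in_cal(18.0, 19.2, 130, 60))   # False (1.2 V high, way off on amps)
    ```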

    Thanks!
    Bill

  2. #2
    Join Date
    Feb 2004
    Location
    Boulder, Colorado
    Posts
    483


    Bill,

    I believe the problem is the notion that the load bank is a critical part of the calibration procedure. It's not, and neither is its setting; it's just a load. Welders can be calibrated with an arc or with a load bank; either way, it's just a load.

    If you are just calibrating CV mig welders, you have three elements that can be calibrated:

    Commanded output voltage
    Measured Output voltage (feedback to closed loop & panel meter indication)
    Measured Output current (just panel meter indication)

    During these measurements, you need to ask yourself if you are adjusting a part of the closed loop process, or just the display meters.
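
    One way to picture that distinction is the sketch below; the structure and names are illustrative only, not any vendor's actual firmware or API.

    ```python
    # Sketch of the three calibratable elements on a CV MIG machine.
    # Structure and names are invented; no vendor exposes this as an API.

    from dataclasses import dataclass

    @dataclass
    class CVWelderCal:
        commanded_volts: float  # operator setpoint driving the closed loop
        feedback_volts: float   # output voltage as the regulator sees it (closed loop)
        panel_volts: float      # display only
        panel_amps: float       # display only

    LOOP_ELEMENTS = {"commanded_volts", "feedback_volts"}

    def affects_weld(element: str) -> bool:
        """Adjusting a loop element changes the arc; a display element only changes the numbers."""
        return element in LOOP_ELEMENTS

    print(affects_weld("panel_amps"))      # False: meter-only tweak
    print(affects_weld("feedback_volts"))  # True: shifts the real output
    ```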

    What model of equipment are you calibrating?

  3. #3

    Electrikmech

    Sorry for the late reply; I have been waiting for the administrator to clarify whether I could post a brand name other than Miller in this forum. I have not received a reply, so here goes. The welders in question are Thermal-Arc MST 300s and 400s. We had their LM300 model in the past and calibrated those with the Miller load bank preset at 350 amps. To make a long story short, about 10% of the MSTs won't pass after 2 years of use. They do have an adjustment on the control board, and we tweaked them to get them to pass when they were new. We have to document everything under ISO, and the welders are set to a specific voltage in each of our operations. Thermal-Arc now wants us to change our calibration settings to get the MSTs to pass; specifically, they want to change the load bank preset from 350 amps to 150 amps. That would mean two different procedures for two models of welder doing the same weld. We don't know what to think at this point. We cannot compromise our production welds. What's your opinion?

    Thanks,
    Bill

  4. #4
    Join Date
    Nov 2010
    Location
    Northern Adirondack Mountains
    Posts
    7

    Rethinking Your Quality Control Chain

    First, Bill makes some very good points.

    There is a lot about your situation we do not know. I ran an avionics and aviation calibration program for many years and am not fully familiar with your equipment. Since you are ISO certified, I assume your load bank and meter are calibrated and traceable to NIST. Is it possible that your calibration equipment is itself out of calibration?

    Your whole Quality Control chain has been altered by the new equipment. Can you really expect new equipment to deliver the same results using setup parameters for a different machine? It would be convenient if they did.

    The bottom line purpose of your calibration program is to assure the repeatability of your welds. You do the load bank testing with the expectation that Quality Control finds the specified weld when the finished product comes off the assembly line.

    Since the new equipment requires a new calibration procedure, you may have to start from scratch with the new machines to see which settings give the results your engineers specify. From there, make sure your calibration program maintains those parameters for each machine. New machines may deliver different welds using the same old parameters.

    What is important is maintaining the repeatability of the specified weld and creating a quality assurance procedure to maintain the repeatability.

    Other thoughts:
    Does the welding equipment measure the parameters at the output, or do the meters just display setup parameters that are only indirectly related to the actual output?

    Wish You the best. Please post how you resolved this.

  5. #5

    Update

    John, our load bank and shunt are under our ISO program and are calibrated and traceable to NIST. We just went through the annual calibration on our test equipment. Our welders are tested at several points along the voltage settings, and we have a corresponding expected amp reading at each setting. For example, on our production lines under actual welding conditions, an 18 V setting produces 130 amps. With our load bank set at 350 amps, 18 V also shows around 130 amps. That is why we have used the 350 amp load bank setting: it duplicates the welding conditions on our production line. The manufacturer of the inverter allows a 5% deviation in their specs, which is about the same as our ±1 volt from 18 V to 23 V.

    After some experimentation, I found I can drastically change the accuracy of the inverter's volt meter through the load I draw on the load bank.

    Load bank setting (A)   Inverter volt set   Volts at the lugs   Output amps
    150                     18 V                19.2                60
    350                     18 V                17.2                125
    600                     18 V                15.9                197
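
    Backing the effective load resistance out of those three points with Ohm's law (R = V/I) shows why the 350 setting matches production; this is just arithmetic on the numbers above.

    ```python
    # Effective load resistance implied by the three test points above (R = V / I).
    rows = [
        (150, 19.2, 60),   # load bank setting, volts at the lugs, output amps
        (350, 17.2, 125),
        (600, 15.9, 197),
    ]
    for setting, volts, amps in rows:
        print(f"bank setting {setting}: R = {volts / amps:.3f} ohm")
    # bank setting 150: R = 0.320 ohm
    # bank setting 350: R = 0.138 ohm  <- about the same load the production arc
    #                                     presents at 18 V / 130 A (0.138 ohm)
    # bank setting 600: R = 0.081 ohm
    ```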

    That is the problem in our process: the 350 setting duplicates our conditions, and the inverter's volt meter is inaccurate at that setting with no capability to be adjusted. I can ignore the voltage and adjust the inverter up until I get 130 amps and 18 V at the lugs, but then my inverter volt setting is over the upper ISO limit.

    We do not want special settings for this inverter if we can avoid it, because we still have a large quantity of the older model in service, the two get swapped sometimes, and with three shifts running them that's asking for trouble.

    We are meeting next week to decide our next step.

    Thanks for all your replies. I'll keep this updated.

    Bill

  6. #6
    Join Date
    Nov 2010
    Location
    Northern Adirondack Mountains
    Posts
    7

    Commanded Voltage vs. Actual Output Voltage

    Bill

    Thank You for taking the time to update us. This is a learning experience for me.

    I pulled up the manual for the 300 MST and took a look at the schematics. I also looked at some load banks, as I have never used them with welders.

    Some items to consider:

    I do know that wire resistive loads show different resistances depending on temperature. Put an amp meter on your toaster and watch the current drop as the element heats up. This is probably why the loads are stepped rather than a single resistive element: a small current would never heat one large element enough to produce the proper resistance.
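
    A rough illustration of that temperature effect, using R(T) = R0 · (1 + α·ΔT); the coefficient below is a ballpark figure for nichrome heating wire, not a number from any load bank spec.

    ```python
    # Resistance rise with temperature: R(T) = R0 * (1 + alpha * dT).
    # alpha is a ballpark value for nichrome wire, not a load bank spec.
    ALPHA = 4e-4   # per degree C, approximate for nichrome
    R_COLD = 0.14  # ohm, an invented cold resistance for one load element
    V = 18.0       # volts applied

    for temp_rise in (0, 400, 800):
        r = R_COLD * (1 + ALPHA * temp_rise)
        print(f"dT = {temp_rise:>3} C: R = {r:.3f} ohm, I = {V / r:.0f} A")
    # The current falls as the element heats, which is why stepped loads settle
    # at a usable resistance while one big element at low current would not.
    ```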

    To clarify what Bill raised earlier: are you trying to calibrate the "Commanded Voltage" setting or the actual output voltage? A lot can happen between the "digital pointer" on the knob and what actually comes out of the machine.

    If you are trying to calibrate the Commanded Voltage reading, there is no guarantee that you ever had the specified voltage at the output. What you may have been able to guarantee is consistency among all similar machines.

    Looking at the schematic, all I see is a Hall-effect loop on the output to measure current. The digital panel meter connects to a printed circuit board somewhere, which leads me to think it is reading out a commanded voltage rather than the actual voltage.

    My opinion at this point: measure the voltage with a separate meter directly at the output, and measure the current directly at the output with a Hall-effect amp meter. Take your measurements during a production run, then see whether those readings correlate with the commanded voltage. Do the same with your new machines. You might end up making a correction card for each machine; the only QC issues you would then need to be concerned with are the repeatability and stability of each machine. You might find you would not have to purchase new machines just because the commanded voltage does not match the output voltage. It would also be good to eventually learn which components are aging in the machines and whether that aging is acceptable.
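
    A correction card can be as simple as a per-machine lookup from commanded voltage to the voltage actually measured at the lugs, interpolating between calibration points. The sketch below is one way to do it; the calibration points are invented for illustration.

    ```python
    # Per-machine correction card: commanded setting -> expected lug voltage.
    # Calibration points are invented; each machine would carry its own card.
    import bisect

    CARD = [(16.0, 15.4), (18.0, 17.2), (20.0, 19.1), (23.0, 22.0)]

    def expected_lug_volts(commanded: float) -> float:
        """Linearly interpolate the lug voltage expected for a commanded setting."""
        xs = [c for c, _ in CARD]
        i = bisect.bisect_left(xs, commanded)
        if i == 0:
            return CARD[0][1]
        if i == len(CARD):
            return CARD[-1][1]
        (x0, y0), (x1, y1) = CARD[i - 1], CARD[i]
        return y0 + (y1 - y0) * (commanded - x0) / (x1 - x0)

    print(expected_lug_volts(19.0))  # 18.15 V expected at the lugs for a 19 V setting
    ```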

    After you have a full understanding of what is going on, then rewrite your ISO procedure.

    Wish You the Best
    John

  7. #7

    Update

    Finally, out of frustration at getting no results, I took matters into my own hands. I removed the adjusting pot on the MIG control board and the resistor feeding it, and determined what change I needed to call up the correct command voltage. After a few tests with my meter, I needed a 1 ohm change. A trip to Radio Shack, a little solder, and some silicone sealant later, I was ready to test my theory. A run on the load bank, a few turns of the pot on the board, and we are now within our specs; as a matter of fact, I can dial it in exactly with the pot at mid point. We have forwarded my fix to the company, and so far no reply. I have redone 4 units total with the same results and now have them running in our most-used stations.
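
    For anyone curious how a 1 ohm change can move a setpoint, the divider arithmetic below shows the idea. The resistor values and reference voltage are invented; the MST board's actual circuit is not published here, so this is only a sketch of the principle.

    ```python
    # Voltage divider arithmetic behind a small feedback-resistor tweak.
    # All component values are invented; the real MST circuit is unknown here.
    def divider_out(v_in: float, r_top: float, r_bottom: float) -> float:
        return v_in * r_bottom / (r_top + r_bottom)

    V_REF = 10.0   # hypothetical reference feeding the command divider
    R_TOP = 47.0   # ohms, hypothetical
    R_BOT = 100.0  # ohms, hypothetical

    before = divider_out(V_REF, R_TOP, R_BOT)
    after = divider_out(V_REF, R_TOP - 1.0, R_BOT)  # a 1 ohm change
    print(f"{before:.3f} V -> {after:.3f} V ({(after - before) * 1000:.0f} mV shift)")
    # A pot left centered in its range then gives fine adjustment
    # on both sides of the corrected value.
    ```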

    Bill

  8. #8
    Join Date
    Oct 2004
    Location
    Edmonton, Alberta
    Posts
    7,905


    OK, from one who does calibration on welders: first we see if the machine can put out to data plate specs. If it can't do that within a few percentage points, we deem it a failed machine.

    Second, while we use a much better Lincoln load bank, it's simply not trusted. Instead we use a couple of Fluke 87s, one for voltage, the other paired with a Fluke 1010i amp meter; these meters are properly calibrated by an independent firm. Often, on a high-tech machine, we will scope the output and compare it to the brand's set waveforms on a given volt/amperage graph.

    Once the machine can put out to its data plate specifications, we match the Fluke meter readings with the actual gauges.

    Not all machines can be adjusted, so we allow them a 5% deviation. Some machines, like many Lincolns, are calibrated via a laptop, with a load and meters to verify.

    Most stand-alone wire feeders, like say a Millermatic 250, cannot be calibrated; neither can most stick machines without gauges. That is why loading to data plate specifications is ultra important.

    Realistically, a machine can say one thing, and 25' or more from that machine the reading will be a little different. Types of control cables, types of arc cables, sense leads, and rated power all play into the final weld. As long as you can say that the machine performs to factory specification, that is all any inspector should be concerned with.
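
    Cable drop alone accounts for much of that difference. A back-of-the-envelope check with Ohm's law, using roughly 0.1 ohm per 1000 ft as the resistance of 1/0 copper welding cable (an approximate figure, and the lead length here is just an example):

    ```python
    # Back-of-the-envelope voltage drop over welding leads (V = I * R).
    # 0.1 ohm per 1000 ft is a rough figure for 1/0 copper cable.
    OHMS_PER_FT = 0.1 / 1000

    def cable_drop(amps: float, lead_length_ft: float) -> float:
        """Round-trip drop: out the electrode lead and back through the work lead."""
        return amps * OHMS_PER_FT * lead_length_ft * 2

    print(f"{cable_drop(300, 25):.1f} V lost at 300 A over 25 ft leads")  # 1.5 V
    ```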
    Last edited by cruizer; 01-19-2011 at 06:37 PM.
