vkmast wrote on 8/10/2012, 6:06 AM
Still, like they say elsewhere, past performance does not guarantee future results.
pilsburypie wrote on 8/10/2012, 6:47 AM
True, but what I was after was the time between when people were first asked to beta test 11 Pro and when it was released. Just for a rough idea of when 12 Pro may emerge.

Kind of worries me that 12 may be round the corner when I'm hoping for a final 11 fix. Could SCS now have written 11 off?
donaldh606 wrote on 8/10/2012, 9:57 AM
I too am worried that 11 may not see a stable version. Am I out of line thinking that everyone who upgraded or purchased 11 should see a FREE upgrade to 12 when it's released? I've lost valuable time restarting projects due to crashes and feel that I should receive some type of compensation. I would just like to see a reliable version release.
Chienworks wrote on 8/10/2012, 10:04 AM
You probably won't get an answer. Those who are asked to beta test agree to an NDA, which, among other things, prevents them from even saying they are testers.

As always with beta testing: those who know don't talk, and those who talk don't know.
pilsburypie wrote on 8/10/2012, 10:11 AM
Ha! A free upgrade to 12? Now that would be nice, but I somehow doubt it! Personally I feel the same as you and think it is the least Sony can do in the absence of a final 11 fix.

Software is a funny beast. If I bought a vacuum cleaner, car, watch - anything physical - and it had the issues many seem to have with Vegas 11, i.e. not working as promised, I could take it back for a refund, no problem. There are also hundreds of makes and models to choose from when you then decide to buy another. There just don't seem to be the same rights with software, or the choice, hence why so many are just battling on with it - which I suppose legally means you are accepting it...

I'm still after anyone chipping in who beta tested 11 via invite and the time delay before release.

(awaits someone telling me they have no problem with Vegas 11 pro and it must be me or my PC!)
pilsburypie wrote on 8/10/2012, 10:13 AM
Aaahhh, James Bond, need to know!

ritsmer wrote on 8/10/2012, 10:35 AM
> I'm still after anyone chipping in who beta tested 11 via invite and the time delay before release.

As Chienworks said: You probably won't get an answer. Those who are asked to beta test agree to an NDA, which, among other things, prevents them from even saying they are testers.

Add that said NDA is, AFAIK, quite extensive: it is time-unlimited (!) and non-terminable (!!) - so do not expect an answer to your question.
Byron K wrote on 8/11/2012, 3:45 AM
One way I've observed to tell when the next version of Pro is coming is the release of the new Studio version first; the new Pro version usually follows 4-6 months after.
pilsburypie wrote on 8/11/2012, 5:50 AM
Thanks Byron. What about people who got an invite to beta test 11 pro but chose not to and didn't sign the NDA. They could surely answer the question.
PeterDuke wrote on 8/11/2012, 6:13 AM
No sign NDA, no inside info.
pilsburypie wrote on 8/12/2012, 3:47 AM
Yeah, I get this, but someone must have received an 11 Pro beta test invite and chosen not to test and not to sign the NDA. These are the people who can tell me the timescales.
PeterDuke wrote on 8/12/2012, 5:33 AM
I received the invite but no timescales were disclosed. The invitation was just that, an invitation to take part, nothing more.
PeterDuke wrote on 8/12/2012, 5:39 AM
Apparently not everybody known to SCS received an invitation for beta testing, so I got to wondering how they select. Perhaps it is the number of posts to the forum, without necessarily checking the quality of those posts. :)
pilsburypie wrote on 8/12/2012, 7:13 AM
Sure no timescales are given but I'm not sure if people are getting what I'm after. Probably because I've not been clear.

People have just been invited to beta test 12. Some have chosen to accept and sign an NDA. Some have not. To try and guess when 12 will be released, I'm after someone who was invited to beta test 11 and chose not to (hence signed no NDA), who can let us know the time gap between their invite and the release of 11.

Just for interest really
JJKizak wrote on 8/12/2012, 7:32 AM
It would seem to me by now that software programmers would have a master configuration of automatic testing to run their software through a thorough test procedure to get the bugs out. In other words install the software then press the test button and watch the final printout for all the bad things that happened.
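The "press the test button" idea JJKizak describes is essentially what xUnit-style test runners provide. A minimal sketch in Python, purely illustrative of the concept - the test case shown is a trivial stand-in, and real suites would be discovered from the codebase:

```python
# A one-button test harness: run a suite of checks and print a
# pass/fail summary. SmokeTest is a hypothetical stand-in for the
# kind of test modules a real product would accumulate.
import unittest

class SmokeTest(unittest.TestCase):
    """Stand-in for a real test module."""
    def test_addition(self):
        self.assertEqual(1 + 1, 2)

def run_all_tests(suite: unittest.TestSuite) -> bool:
    """Run the suite, print one line per failure, report overall result."""
    result = unittest.TextTestRunner(verbosity=0).run(suite)
    for test, _traceback in result.failures + result.errors:
        print(f"FAILED: {test.id()}")
    return result.wasSuccessful()

if __name__ == "__main__":
    suite = unittest.defaultTestLoader.loadTestsFromTestCase(SmokeTest)
    print("ALL TESTS PASSED" if run_all_tests(suite) else "SEE FAILURES ABOVE")
```

The catch, as the posts below explain, is that such a harness only checks what someone thought to write a test for.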
Dan Sherman wrote on 8/12/2012, 9:21 AM
Vegas Pro 12 will be released when it is released.
Why does it have to be a guessing game?
Paul Masters wrote on 8/12/2012, 10:10 AM

As I recall, a number of years ago Sony indicated that updates come about every 18 to 24 months. As 11 has been out about 12 months, I guess 12 would be out in about 6-12 months. However, programming being what it is, there is no easy way to tell. Release may also depend on marketing timing.

I was a mainframe systems programmer for over 40 years.

There were / are products for testing. However, they can be very problematic. That is, they are script driven. The script can be created in a 'learning' mode - that is, first you run through the application and the product records what you did. Later, you have the test product play back the script. The problem is, if there is a change in what that activity does in the program you are testing, the test product doesn't know that the change is intentional and flags it as a variant. Once changes are implemented in the program, the script has to be regenerated, sometimes in its entirety.
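The record/replay scheme Paul describes can be sketched in a few lines. This is a toy illustration, not any vendor's product: a 'learning' pass records each action alongside the output it produced, and a replay pass flags any output that differs - including intentional changes, which is exactly the maintenance problem he points out:

```python
# Record/replay testing in miniature. The two "application" versions
# are hypothetical stand-ins; in v2 the output format was changed on
# purpose, yet replay still flags every action as a variant.
def application_v1(command: str) -> str:
    return f"OK: {command.upper()}"

def application_v2(command: str) -> str:   # later release, changed output
    return f"DONE: {command.upper()}"

def record(app, actions):
    """Learning mode: capture (action, expected output) pairs."""
    return [(action, app(action)) for action in actions]

def replay(app, script):
    """Replay mode: rerun each action, return (action, expected, actual)
    for every output that no longer matches the recorded script."""
    return [(action, expected, app(action))
            for action, expected in script
            if app(action) != expected]

script = record(application_v1, ["open", "save"])
variants = replay(application_v2, script)   # both actions now differ
```

Because the tool cannot distinguish an intentional change from a regression, the script has to be re-recorded after every such change - hence the maintenance burden.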

That was with text-based applications. I can't imagine how graphics-based programs such as Vegas could be tested that way. But then there are people working on PCs who are much smarter about them than I am. In any case, they likely have a manual script that they can run. That would make testing quicker and more consistent.

Paul Masters

PS: Even if you are a beta tester (I have been for various products over the years), they still don't tell you anything.
videoITguy wrote on 8/12/2012, 1:45 PM
Paul, your account of text-based auditing of mainframe programming is interesting. I don't pretend to know how SCS conducts their beta tests, and in fact there are many people on this forum who do have knowledge of more recent test/beta processes in modern software.

I actually don't believe they are doing anything nearly as sophisticated as one would believe. In fact I could be proven 100% wrong about that, but...

Like your mainframe process, there are programming methods at hand that can be used to stress test visual interfaces. But I don't think SCS goes that far.

A more free-form approach that I have seen is to just hand out the beta program to experienced users and let them play with it. The problem with this approach is that the evaluations can hardly be correlated to a tangible finding. A more disciplined style of testing is really needed. But I don't think SCS goes that far either.

I do hope that a former beta tester of an SCS product will come forward one day and state, under an anonymous signature, how they were asked to test. And again, I don't think that will happen!
Paul Masters wrote on 8/13/2012, 10:28 AM
Testing is quite an interesting problem.

The problem is that the people who write the programs / applications, 'know' what you are supposed to do and not do.
The problem is that the people who use the programs / applications usually don't.

An example: Many years ago, a person in our group asked me to test his program. He said he had worked hard on it. He didn't think I could find any 'bugs'.
I started the program and pressed Enter. The program crashed.
He said, "You're not supposed to do that!" I told him, well, I did, and others may do so as well. (I always did the same when testing my programs.) A simple check of each field to see if it was blank, and an error message if that was not allowed, would fix that problem.
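The fix Paul describes is a one-line guard in most languages. A hypothetical sketch - the field name and message are made up for illustration:

```python
# The guard Paul describes: reject blank input with a clear error
# message instead of letting it crash downstream code.
def read_field(raw: str, field_name: str, required: bool = True) -> str:
    """Return the trimmed field value, rejecting blank required fields."""
    value = raw.strip()
    if required and not value:
        raise ValueError(f"{field_name} may not be blank")
    return value
```

With this in place, pressing Enter on an empty prompt yields a recoverable error rather than a crash.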

People do all kinds of things. "Let's see what will happen if I do this." Or entering a value higher than the one allowed because that's what they want but don't know what the maximum value is.

When we got an update of a utility, I would usually go through the vendor's demo / tutorial. Sometimes I would find problems even if I followed the instructions exactly - because my interpretation of 'exactly' may not be theirs.

So having a script to follow can be useful. First, it can provide consistency in various environments. The mix of PC hardware and software is considerably more varied than on a mainframe. And the minute variations each person unknowingly makes to the instructions can and likely will cause problems.
The hard part is describing 'exactly' what you did to cause the problem. Dumps are a great help, but if the programmer doesn't understand how you got there, it can be difficult to fix or even find what caused the problem.

Paul Masters
JohnnyRoy wrote on 8/13/2012, 12:16 PM
> "I do hope that a former Beta tester of SCS product will come forward one day and state in anonymous signature how they were asked to test. And again I don't think that will happen!"

I can tell you that no company worth its salt would "tell" a beta tester "how" to test. They may say, please check out these new functions and give us feedback, but the whole purpose of a beta test is to see how customers will react to a program. Since companies don't tell customers how to use the program, they should not tell beta testers how to test the program. A good beta tester just uses the program for their normal workflow and reports any errors they encounter. That's why it's important to get a wide variety of beta testers with different workflows on a wide variety of platforms.

As Paul pointed out... you want the beta tester to do things that you would never think of doing like starting the program and pressing enter when you haven't done anything to see if the program handles it gracefully.

Beta testing is NOT a replacement for thorough in-house testing / regression testing. That's when you tell a tester "how" to test in order to get complete code coverage. Beta testers should act like any customer off the street.

videoITguy wrote on 8/13/2012, 12:21 PM
+Paul Masters:
This is why, in any beta testing environment that I take part in for visual-interface software, we use a companion screen recorder and a separate narrative track to document user interaction with the GUI. This, in combination with data dumps and time stamps, provides complete logging.

The beta tester narrates in a deliberate and thorough manner what they intend to do, they execute each action step by step with the controls (mouse, tablet, etc.), and the screen capture documents success OR otherwise.

I agree totally with Johnny Roy, the problem lies in the communication tools that the field "Beta" tester can give back to the company.
JJKizak wrote on 8/13/2012, 12:32 PM
I kind of think one of the testers should have as many sporadic fingers and thought patterns as a 3-year-old kid, with a very high-speed camera filming all the actions for comparison afterwards. Then you can insert "frivolous anticipation" software that sorts it all out, kind of like an anti-virus application. We can call it "Sporadic Anticipation Software" and sell it on the side.