Hello!
I've been watching Wine a bit recently, and there's been a lot of activity (code-wise) on new features (back-pressure from the freeze, I'm sure) and lots and lots of test fixes. I know we have test.winehq.org, which is fantastic, but it doesn't give a very good feel for the progress that all of these recent patches have been making. Sure, we can look at individual days and see how things go, but there's no real aggregate over time to show how the test suite is doing as a whole across all platforms. Do we have any such metrics/graphs? I think those would be very interesting and perhaps motivating. (Perhaps I should work on that and add it to the last section in the WWNs: Bugs / AppDB / Test Suite?)
Also, Dan Kegel has been extraordinarily diligent in valgrinding of late (/applause), and I just wanted to ask whether we're doing anything to support him. Once he posts to the list, it seems that it's the responsibility of the author of the breaking patch to fix the problem. Some people have been very diligent about that (again, /applause), but I think some of Dan's posts have been left unanswered. I know Dan mentioned perhaps making passing under Valgrind a requirement (along with not breaking the tests) for patches to be committed. I think we should take a look at this and seriously consider implementing it.
--Zach