Wikipedia talk:India Education Program/Analysis

Outstanding Questions

A big question for me is: what is the best way to measure the impact of this program on the community? We certainly don't want to put more work on the community to estimate this impact, but we also believe measuring the program's impact on ordinary editors is critical to having a full picture of what happened in the Pune pilot. Any ideas of how to overcome this challenge? -- LiAnna Davis (WMF) (talk) 22:14, 1 December 2011 (UTC)

I think the best way to go about this would be to estimate some sort of "crap fraction" reflecting problems that need to be cleaned up in a similar manner to the PPP metric:
  (k1(a + b) + k2·c + k3·d + k4·e) / m
with
  • a = copyvio edits in any namespace
  • b = copyvio uploads (local + commons)
  • c = mainspace edits deleted or reverted for other reasons. Redirecting an article counts as reversion.
  • d = mainspace edits with orange {{ambox}} problems (e.g. lack of RS, POV). If an edit has multiple problems, add the number of problems instead (e.g. unreferenced POV edit increases d by 2).
  • e = mainspace edits with yellow {{ambox}} problems (e.g. wikify, copyedit). If an edit has multiple problems, add the number of problems instead.
  • m = mainspace edits
Edits should be assigned to the category with the highest weight (e.g. G12 deletion => 10 points). The weights are arbitrary but reflect the severity of the problem: k1 > k2 > k3 > k4. I suggest something like k1 = 10, k2 = 4, k3 = 2, k4 = 1. MER-C 03:36, 2 December 2011 (UTC)
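The weighted fraction proposed above can be sketched in a few lines of Python. The function name, the example counts, and the zero-edit guard are illustrative assumptions; the weights and category definitions follow the proposal, and the hard part (classifying each edit by hand) is assumed to have happened already.

```python
def crap_fraction(a, b, c, d, e, m, k1=10, k2=4, k3=2, k4=1):
    """Weighted problem score per mainspace edit:
        (k1*(a + b) + k2*c + k3*d + k4*e) / m
    a: copyvio edits (any namespace); b: copyvio uploads (local + Commons);
    c: mainspace edits deleted/reverted for other reasons;
    d: orange {{ambox}} problem count; e: yellow {{ambox}} problem count;
    m: total mainspace edits. Default weights per MER-C's suggestion.
    """
    if m == 0:
        raise ValueError("no mainspace edits to score")
    return (k1 * (a + b) + k2 * c + k3 * d + k4 * e) / m

# Illustrative numbers only: 3 copyvio edits, 1 copyvio upload,
# 5 reverted edits, 4 orange-tag and 6 yellow-tag problems,
# across 50 mainspace edits.
print(crap_fraction(3, 1, 5, 4, 6, 50))  # (40 + 20 + 8 + 6) / 50 = 1.48
```

Note that an edit lands in exactly one category (the highest-weighted one that applies), except that multiple tag problems on a single edit each add to d or e.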

CCI

Attempting to assess the impact on the community of the CCI is virtually impossible at this stage because very little of the work has been done. The investigation is less than a week old and isn't going to be finished for a long time yet (we still have open two-year-old CCIs). I suppose you could investigate what percentage of contributions have been removed as copyright violations, but that's going to produce a severe underestimate because most of the edits haven't been systematically surveyed for copyright problems. Hut 8.5 23:56, 1 December 2011 (UTC)

Would there be data points like the backlog increase or anything like that we could use? We do have a couple of months to do this analysis, so we want to make it as thorough as possible, and if that means waiting until the CCI process has had a chance to start so that some time estimate can be extrapolated, that's fine. -- LiAnna Davis (WMF) (talk) 00:03, 2 December 2011 (UTC)
You could count the number of edits, users or pages which have been reviewed, I suppose, but such a statistic might not be very meaningful because some users are much easier to check than others (some project participants have no edits apart from adding themselves to the project, for instance). It's possible that in a few months the first page of the CCI investigation might be complete, in which case you could get an idea of the number of copyright violations produced by a significant sample of users (196). Hut 8.5 00:17, 2 December 2011 (UTC)
Even that would be an underestimate due to copyright violations that were deleted before the CCI was run. To get a complete picture, you must consider both live and deleted edits. The CCI is also not complete: I noticed one user who posted a copyvio in the user talk namespace. The CCI backlog is so ridiculously large that the proportional impact of the IEP is rather small. The best way to assess impact on the community would be a crap edit (all namespaces + Commons) / potentially useful edit fraction, weighted heavily towards copyvios. MER-C 02:13, 2 December 2011 (UTC)

Is analysis needed?

"We want to do a thorough analysis of the Pune pilot program to derive learnings and trends"

Or you could spend a few minutes asking some of the people on the (virtual) ground. This doesn't need statistics - a qualitative analysis (which half a dozen names could deliver in less than half an hour) would tell WMF more than I still suspect they want to hear. A snazzy PowerPoint with figures on it is no excuse for A Clue. Some of the basics of what went wrong are very obvious, and they just need fixing. We don't care whether things went a lot wrong or a little; they went too wrong, and they need to be avoided completely in the future. Analysing trends will not explain any of this.

I'd note also that some of the really deep questions rely on the so-far invisible relationship between the WMF and the Indian colleges. Who expected to gain what from this entire process? Without knowing what the intention ever was, it's hard to know just where it went wrong. Andy Dingley (talk) 00:18, 2 December 2011 (UTC)

Hi Andy, please note that both quantitative and qualitative analyses are happening! As mentioned, Tory Read (our outside evaluator) is interviewing a few Wikipedians for her report (I see from Hisham's talk page that you were one of the ones asked). We've also been taking note of all the information that's been previously mentioned on the talk pages. If you think we need more editor interviews, suggest that! We want to do such a thorough analysis because everyone we talk to has a different answer about what's wrong with our program, and I think that's because our program design was flawed on multiple points. Fixing one or two of the problems will just result in another frustrating pilot; we want to take the time to plan the next one right, and that involves a very thorough analysis to identify all the reasons why our first pilot failed, and what we can do to fix those problems.
I'm glad you said you think the relationship between WMF and the colleges is invisible, as I wasn't aware that was an issue. The Wikipedia Education Program (this is true for our program in India as well as our current programs in the U.S. and Canada) is designed to get more people contributing to Wikipedia and to improve the quality of the content of Wikipedia (you'll recognize these as goals from the Strategic Plan). Obviously the quality part failed miserably in our Pune pilot, but data from our U.S. pilot showed that article quality improved 64%, which led us to expand the pilot to other countries, including India. The India program is designed to specifically address the Global South parts of the Strategic Plan. You can see the benefits for instructors on the Education Portal's Reasons to Use Wikipedia page. If you're looking for more details on the communication between WMF and various other parties in India, I encourage you to check out the Wikipedia:India_Education_Program/Documentation page. I'd be happy to clarify any other questions you have about this -- I'm really sorry it wasn't clearer before! -- LiAnna Davis (WMF) (talk) 00:55, 2 December 2011 (UTC)
Extolling the virtues of some relatively minor successes is only to cloud the real issues. The Pune experiment was a major success in that it has clearly demonstrated how not to plan, organise, and execute what are nevertheless extremely important initiatives in the effort to expand the encyclopedias and reach out to other cultures. The endemic gap between the salaried operatives and the volunteers needs to be closed, and there should be less unilateral action on the part of the organisers without listening to the community amongst whom are also some experts (who are not paid). The organisers have been aware for a long time that the relationship between WMF and the colleges is invisible, as well as the issues concerning the general planning and management of the projects from the top down and the absence of communication and cooperation with the online community.--Kudpung กุดผึ้ง (talk) 03:59, 2 December 2011 (UTC)
"important initiatives in the effort to expand the encyclopedias "
That's my main point. Was this an effort to benefit the encyclopedia by adding content, by adding new editors, or to benefit the students/college by giving them an exercise? I don't want to care if this stuff is invisible or not, I don't want to even have to think about it, but when it's time for analysis we have to know what the original hope was before we can judge whether it was met. All of these goals have some value to them and appear simple to achieve. In actuality, none of them would be anything like so easy to meet.
  • Expanding the encyclopedia is done by having competent people add competent content to it - as a minimum. Usually this arises through some personal passion for a topic and prior knowledge. Students who are arbitrarily assigned a topic are unlikely to have this. It was a huge problem for IEP engineering topics. Students obviously didn't know the topics, didn't stop and learn them before starting to write and had no inclination to even try to - most seemed to think that blindly pasting was enough, without any intervening comprehension. Engineering topics were also poorly worded as titles with no explanation as to what was meant.
  • We're regularly told that we need new editors (despite the loss of good, undervalued editors being a far more serious problem). Recruiting them from students sounds like a good idea, but getting past WP:BITE needs a fair personal commitment, not just a course assignment. We not only asked students to edit, we asked them to deliver an acceptable article from cold, on their first article. That's hard to achieve on a first edit and it can only be done by people who know something well enough to write it, and who have enough editing skill to produce it. How do we train people to this level? As general quality standards are pushed higher, this is going to become a bigger and bigger problem - how do we get new editors past this skill gap without driving them away first by instant reversions and warnings?
  • As a benefit to students, it's hard to see how it can possibly work. What's it for? To teach a topic, or to teach the technique of writing for a public audience? The second has some attraction, but where was the teaching? Students were dumped into the end-of-course assessment exercise without being taught anything beforehand. Just how was that expected to work?
  • Any benefit to tutors is probably a bad idea from the outset. "Here's a chance for an easy self-marking exercise where a pre-existing community can be borrowed as tutors and assessors" is bad teaching, and sheer exploitation of the WP user community. I suspect this was one of the real goals behind this project.
The ones I feel sorry for in the midst of all this were the students. They were given a poorly thought out and broadly impossible task, then criticised from both sides afterwards. Andy Dingley (talk) 11:22, 2 December 2011 (UTC)

Programmatic results

To be certain we are measuring what needs to be measured, I'd like to see more detail regarding the parameters/dimensions of the program. I suggest starting with a list of the program goals and the planned program components and add a list of the unintended/unexpected outcomes. Then quantify those items and analyze the relationships among them. Jojalozzo 00:51, 2 December 2011 (UTC)

Great idea. I'll start working on this list. -- LiAnna Davis (WMF) (talk) 01:11, 2 December 2011 (UTC)

Core issues

The answers do not lie in added bureaucracy: proposed solutions and metrics are all available already in the many and various comments by community members on the increasing maze of talk pages connected with the IEP and USEP projects. The impact on the community is blatantly obvious - why keep creating yet more pages to add to the confusion, and carry out further costly analysis, when the answers are already staring us in the face, complete with the charts and graphs already provided, and have been discussed in depth on the mailing lists and other obscure lines of discussion? The main answer lies in rectifying the continuing lack of communication, transparency, and admission of errors (see 'introspective' in my post above). The main concern raised from the Pune pilot from the community angle is not being addressed, and LDavis and/or AnnieLin have made it clear elsewhere that they do not consider it part of their remit to take the community resources - the very impact that is being mentioned here - into consideration when planning their education projects; ignoring the known problems and trying to find solutions to new ones that apparently still need to be identified is a redundant exercise. Ultimately, this will simply foster more ire and drive yet more volunteers and OAs away from wanting to be helpful, rather than solicit their aid, which in any case can only be to repeat what they have already said time and time again. Solutions and suggestions have been tossed around by some extremely competent and knowledgeable members of the volunteer force, only to land repeatedly in some kind of no man's land between the WMF and the community.
It is imperative to understand that all education programmes will generate more articles - which is of course the goal of the initiatives, and which is recognised and supported in principle by everyone - that will still need to be policed by experienced regular editors, and that these programmes cannot be implemented before the online volunteer community is forewarned and forearmed with the required tools and personnel. Perhaps Tory's independent analysis will come up with some answers (and I'm confident it will), and it may be best to wait for her report. Kudpung กุดผึ้ง (talk) 03:33, 2 December 2011 (UTC)