Integrate speaker feedback tool #10

Open

hlashbrooke opened this issue Aug 4, 2020 · 6 comments

@hlashbrooke (Collaborator) commented Aug 4, 2020

The speaker feedback tool was built for WordCamps. It would be great to activate it on the Learn site for workshops. https://make.wordpress.org/community/handbook/wordcamp-organizer/speaker-feedback-tool/

CC @coreymckrill

@hlashbrooke (Collaborator, Author) commented Aug 24, 2020

This is a high priority for the next iteration, but we need to have some discussion about what questions we'll be asking. I'll comment here later with some thoughts.

@hlashbrooke (Collaborator, Author) commented Aug 24, 2020

The WordCamp speaker feedback tool asks attendees to submit a star rating for the session and then answer three questions:

  • What was great in this session? (required)
  • What could have improved this session?
  • Do you have any additional feedback?

The organisers vet this feedback, and approved feedback is automatically shared with the speakers.

This same flow would be great for Learn WordPress, but I think we should modify the questions to be more focused on the objective of learning something tangible, so here is my first pass at updating these questions:

  • What did you learn in this session? (required)
  • Is there anything else you expected to learn that wasn't included?
  • Do you have any additional feedback?

These would be submitted along with a star rating. Since the answers would be seen by the presenters, these questions would allow them to get an idea of how well their content is being received and if there's anything they need to tweak in order to make it more effective.

Some questions coming out of this:

  1. Would we be able to run some sentiment analysis on the answers to get a view of what people are saying?
  2. How would we handle the vetting of the feedback? For WordCamps, the organisers approve each feedback item, so maybe for Learn, it can be one of the responsibilities of workshop reviewers to periodically check for new feedback and approve it.
  3. Since Learn isn't tied to a specific event date like a WordCamp is, how would that affect things? The feedback form would need to be permanently available.
  4. Where would the feedback form live? Directly on the workshop page, on a separate page that is linked from the workshop page, or something else?
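On question 1 above, a first pass at sentiment analysis could be very simple before reaching for an NLP library or service. This is a minimal keyword-based sketch only; the word lists and function names are illustrative, not part of the actual tool:

```python
# Toy sentiment scoring for feedback answers. Word lists are
# illustrative placeholders, not an exhaustive lexicon.
POSITIVE = {"great", "helpful", "clear", "useful", "excellent"}
NEGATIVE = {"confusing", "unclear", "boring", "slow", "missing"}

def sentiment_score(answer: str) -> int:
    """Positive minus negative keyword hits for one answer."""
    words = {w.strip(".,!?").lower() for w in answer.split()}
    return len(words & POSITIVE) - len(words & NEGATIVE)

def summarize(answers):
    """Bucket answers to give reviewers a rough view of overall tone."""
    buckets = {"positive": 0, "neutral": 0, "negative": 0}
    for a in answers:
        score = sentiment_score(a)
        key = "positive" if score > 0 else "negative" if score < 0 else "neutral"
        buckets[key] += 1
    return buckets
```

A production version would likely swap `sentiment_score` for a proper sentiment model, but the bucketed summary is the part that would surface "what people are saying" at a glance.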
@angelasjin commented Aug 24, 2020

Adding to @hlashbrooke's suggestions for question modifications, can we add something along the lines of "Were learning objectives clearly presented?"

@CdrMarks (Collaborator) commented Aug 24, 2020

Related to the Collecting and Reporting Stats for Learn WordPress Discussion Groups P2 I posted, I'm wondering if Learn's implementation of the speaker feedback tool could help us measure attendees' growth in competency on the topic by asking two questions:

  • Before watching the workshop and participating in the discussion, my competency level on the topic was:
  • After watching the workshop and participating in the discussion, my competency level on the topic is:

Both of the questions above would be answered with a numeric value from 1-10.
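Aggregating those two ratings would give a simple per-workshop growth metric. A sketch, assuming responses arrive as dicts with hypothetical `before`/`after` keys holding the 1-10 values:

```python
# Sketch: average competency growth across responses for one workshop.
# The "before"/"after" field names are assumptions for illustration.

def average_growth(responses):
    """Mean of (after - before) across responses; 0.0 if there are none."""
    if not responses:
        return 0.0
    deltas = [r["after"] - r["before"] for r in responses]
    return sum(deltas) / len(deltas)
```

Tracking this value per workshop over time would feed directly into the stats reporting discussed in the P2 post.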

I'd also suggest we can ask how valuable the user found the following components:

  • the workshop
  • the discussion

For both of the components above, we could present 3 options to select from: not valuable, somewhat valuable, very valuable.

@hlashbrooke (Collaborator, Author) commented Aug 24, 2020

I like that! How about we go with this set of questions then:

  • Star rating
  • How competent were you on the topic before watching the workshop? (rating 1-10)
  • How competent are you on the topic after watching the workshop? (rating 1-10)
  • What did you learn in this session?
  • Is there anything else you expected to learn that wasn't included?
  • Do you have any additional feedback?

Is that too many questions? Only 3 of them are text questions - the others are ratings only.
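For implementation purposes, the proposed question set could be driven by configuration rather than hard-coded. A sketch, with hypothetical ids and a `required` flag assumed only for the star rating:

```python
# Hypothetical data-driven representation of the proposed question set.
# Ids, types, and the required flag are assumptions for illustration.
QUESTIONS = [
    {"id": "overall", "type": "stars", "required": True,
     "label": "Star rating"},
    {"id": "before", "type": "scale", "min": 1, "max": 10,
     "label": "How competent were you on the topic before watching the workshop?"},
    {"id": "after", "type": "scale", "min": 1, "max": 10,
     "label": "How competent are you on the topic after watching the workshop?"},
    {"id": "learned", "type": "text",
     "label": "What did you learn in this session?"},
    {"id": "expected", "type": "text",
     "label": "Is there anything else you expected to learn that wasn't included?"},
    {"id": "other", "type": "text",
     "label": "Do you have any additional feedback?"},
]

def missing_required(submission):
    """Return ids of required questions absent from a submission."""
    return [q["id"] for q in QUESTIONS
            if q.get("required") and not submission.get(q["id"])]
```

Keeping the questions as data would also make it easy to adjust wording later without touching form logic.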

At the time we're asking people to fill in this survey, they would have just completed the workshop and not yet taken part in a discussion group. We have a separate survey for discussion group attendees, so asking about the value of the group at this stage isn't really practical.

@CdrMarks (Collaborator) commented Aug 24, 2020

Since this set of questions would be asked after the completion of the workshop but before the discussion group, I wonder how quickly the discussion group leaders would be able to see the answers. The last two questions proposed by Hugh above would be good for the discussion group leaders to focus on:

  • Is there anything else you expected to learn that wasn't included?
  • Do you have any additional feedback?