A 360 feedback report is a fantastic tool for development. There are lots of companies out there who can create a fancy 360 feedback report for you. What if your L&D budget doesn’t stretch that far and you want to complete something in-house?
This was the situation I was faced with. I thought, ‘there must be a way to create this in-house.’
With a bit of creative thinking and a little bit of time, I managed to create something that wasn’t too dissimilar to a professional 360, and still totally anonymous!
This article will take you through every detail you need to know to create your own in-house 360 feedback form. Be warned, this is a lengthy and detailed step-by-step guide, so if you’ve not got the time now, bookmark it for later.
By the end of this post you will know, step by step, how to create an in-house 360 report, and you can learn from the mistakes I made doing this for the first time.
Disclaimer: This article will not cover how to effectively facilitate a 360 feedback discussion. You should consider carefully if you are the best person to administer feedback. 360 feedback should be non-judgemental and impartial.
Preparing your competency questions
Identify the purpose of your 360 feedback report
Is the purpose of your 360 part of a specific development programme or a generic organisation-wide offer? The two may require different sets of competencies.
Organisation-wide – Does your organisation have a competency or behavioural framework? If so, align your 360 feedback report to this and some of the hard work has been done for you. If not, but you’d still like some alignment, incorporate the company’s values as behavioural statements.
Development programme – If it’s for a specific development programme, you first need to scope out the key themes you are looking to achieve with the programme, which you should have already done in your up-front training evaluation. With clarity on the behaviours the programme seeks to improve, you can build clear competencies around these.
How do you identify your competencies to assess against?
Now you know the general theme for the competencies there are a few other things to consider.
- Number of competencies – No more than 5-7 main competencies, with no more than 6 sub-competencies. The longer the 360, the less chance the feedback will be valuable. Keep it short.
- What the competency statements will be – If you’re stuck, a quick Google search will provide you with some inspiration. Appraisal 360 have a great list of competency framework suggestions to get you started.
- What’s your scoring criteria? – I used a rating scale from ‘strongly disagree’ to ‘strongly agree’ on a 4-point scale. This forced a choice and stopped individuals sitting on the fence. I also added an N/A (not applicable) option in case the rater hasn’t observed that specific competency.
Identify any other open text questions you want to ask
Open text questions are important to include as they provide context on the scores. Without context you’ve just got a lot of numbers. From personal experience, people need the open text responses to make sense of the scores, so make sure you include some at the end.
My open text questions were:
- What two things can they do to improve?
- What two things do they do well?
- Any other feedback for this person?
How to create the 360 feedback report survey
Before you create anything in the survey platform, finalise it in Word first. It takes much longer to work directly in the survey software than it does to copy and paste once you’re happy! Below are my tips for creating the survey.
Choose a survey platform
There are several platforms out there, such as SurveyMonkey, Qualtrics etc. Most have a free ‘lite’ version, though you get more scope if you purchase a package. Check first to make sure the level of package you have meets your needs.
Put a clear introduction page on the 360-feedback explaining how to complete and why
This is important because those invited to rate can be sceptical about feeding into 360 feedback. They mainly want to know if it’s anonymous and how the data will be used. Things to mention in the introduction are:
- That the 360 responses are all anonymised, but providing specific examples may reduce their anonymity, e.g. stating ‘In Monday’s meeting…’
- That the feedback is for developmental purposes
- Explanation of the rating scale and when to use N/A option
- That honest, but not personal feedback is required
Make questions mandatory
I’d recommend making questions mandatory if you have this functionality (most do) – to make sure no sections are skipped.
Keep open text questions for the end only
Don’t have an open text question at the end of each competency section. I had this the first time; it wasn’t getting many responses, and the 360 lacked context. After getting feedback, I found it’s better to keep the open text questions at the end rather than throughout.
Keep it anonymous
Make sure the form is completely anonymous. No names, locations or other identifying info that might put people off completing.
Remember to have an identifier
The only identifier should be what type of ‘rater’ they are. I’d recommend splitting the data in the report into rating groups, because the first time I did it, I lumped all the data together into one score. When I had to discuss these ratings in a feedback session, the individual found it difficult without knowing the difference in scores between their line manager and direct reports.
For reference, I used:
- Peer assessment
- Line manager assessment
- Direct report assessment
Who they are completing the survey on behalf of
If you are sending out multiple 360s at the same time, have an open text box in the survey for the name of the person being rated, e.g. ‘I am completing this report on behalf of Sophia Grainger.’
Identify the raters
Decide who you want feedback from. I use self-assessment, peer assessment, line manager assessment and direct report assessment. This is up to you; some 360 reports don’t include peers.
Whoever you choose, anonymity is important. Send it to a minimum of 3 people per rater group. For example, if an individual only has 2 direct reports, this reduces anonymity and their scores shouldn’t be shown in their own section. In this instance you might have to either merge their scores or find more raters.
Inform the line manager that their rating isn’t anonymous.
Ask the person receiving the 360 feedback report to identify which peers would be able to complete it on their behalf (if you’re using them). Once identified, ask the individual to contact the peers to let them know they’ve been chosen. This makes them more likely to respond. Remember, there needs to be at least 3 for anonymity.
Email the individual raters to ask for feedback:
Send the email yourself, to increase anonymity for the individuals. Give a clear date to complete by and send it blind carbon copied (BCC).
Send a chasing email to the entire group one week before the deadline, since you aren’t able to track who has completed.
How to manipulate the data in Excel
If you’re new to Excel or unsure how to manipulate the data, I’ll walk you through all the steps I took.
I exported all the data to Excel
If required, turn your ‘strongly agree’ and ‘agree’ cells into numbers; you can do this quickly by using the ‘Replace’ function. I make:
- 1 = strongly disagree
- 2 = disagree
- 3 = agree
- 4 = strongly agree
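If you’d rather do this conversion in a script instead of Excel’s Replace function, here’s a minimal Python sketch of the same mapping (the labels match the scale above; the survey export format and column names will vary, so treat this as illustrative):

```python
# Map the text labels from the survey export to numeric scores.
SCALE = {
    "strongly disagree": 1,
    "disagree": 2,
    "agree": 3,
    "strongly agree": 4,
}

def to_score(label):
    """Return the numeric score for a rating label, or None for N/A."""
    return SCALE.get(label.strip().lower())

responses = ["Agree", "Strongly agree", "N/A", "Disagree"]
print([to_score(r) for r in responses])  # [3, 4, None, 2]
```

Returning None for N/A (rather than 0) matters: an unobserved competency should be excluded from the averages, not drag them down.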
Split the data into the rater groups. For me, self-assessment, line manager assessment, direct report and peer assessment.
Add an average of the scores for all the sub-competencies. The formula I used can be found in the formula box.
Create averages for each sub-competency, for groups with more than one rater. Again, the formula I used can be found in the formula box.
Copy and paste the two single score rows from self-assessment and line manager, and the average score rows from direct report and peer assessment. Make sure you paste the average rows as values (Paste Special → Values), or the scores might not transfer correctly.
The data you’ve just copied together in plain text format from all the different raters can be used to make your graphs. I went for a bar chart.
Take the summary of the averages column from each competency section (step 4) to make a spider diagram of the key themes. For reference, I had 7 competencies; I took the average scores from the 7 competencies across the 4 different rating groups. This data became the spider diagram.
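The two levels of averaging above can also be sketched in plain Python. The rater groups match the ones used earlier, but the scores here are made-up illustrative numbers, not real data:

```python
from statistics import mean

# Illustrative scores for ONE sub-competency, grouped by rater type
# (N/A responses already removed before averaging).
ratings = {
    "peer": [3, 4, 3],
    "direct report": [2, 3, 4],
    "line manager": [4],  # single rater: score used as-is
    "self": [3],
}

# Average per rater group -> feeds the bar chart for this sub-competency.
group_averages = {g: round(mean(s), 2) for g, s in ratings.items()}
print(group_averages)

# Average across the four groups -> one point on the spider diagram.
print(round(mean(group_averages.values()), 2))
```

Repeating the second step for each of the 7 competencies, per rating group, gives you the full spider diagram data.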
How to turn the data into a 360 feedback report
This is an example of the report I created, I kept it simple.
Create a word document
Share the total number of respondents
Start with the key themes spider diagram
Add in a page for each competency with the bar charts you created
Add in the free text questions at the end
Delivering the 360 feedback report
I just want to reiterate the point that 360 feedback should be handled carefully. If you aren’t experienced in coaching, or providing impartial feedback, I suggest you bring someone in who can.
Another option is to get yourself some training first before you start delivering the feedback. DecisionWise offer an online webinar on 360 feedback training, which could be a good place to start.
Phew, you made it all the way to the end! That. Was. Detailed!
This article should’ve given you enough information to go ahead and create your own in-house 360 feedback report. Good luck!
Want more great L&D content like this? Sign up to my mailing list. As a thank you, I’ll send you my free little ebook of learning and development essentials.