Scam detector tuneup

As hinted in Phish with no smell, some scams are hard to detect. How hard?

False confidence

Let’s try 7 Ways to Recognize a Phishing Email (published on Security Metrics).

Assertion: Legit companies don’t request your sensitive information via email.
Real world: Actually, they do. Many accountants and lawyers ask for sensitive documents to be sent to them by email.

Assertion: Legit companies usually call you by your name.
Real world: So do scammers. I regularly receive spam containing my name.

Assertion: Legit companies have domain emails.
Real world: So do scammers. See the Salvation Army example in Phish with no smell, below.

Assertion: Legit companies know how to spell.
Real world: So do scammers. Again, see the Salvation Army example.

Assertion: Legit companies don’t force you to their website.
Real world: Actually, they do. PayPal, banks and airlines require you to visit their websites to get even routine information. Software vendors commonly send whitepapers as individualised links to their websites. Training organisations commonly send tokenised links to webinars or videoconferenced meetings that nonetheless require sign-in through a portal.

Assertion: Legit companies don’t send unsolicited attachments.
Real world: Actually, they do. Many include a banner or footer that email readers detect as attachments or embedded images. Outsourced services, such as our voicemail system, may deliver unexpected attachments.

Assertion: Legit company links match legitimate URLs.
Real world: So do scammers’ links. Some scammers use realistic substitute URLs (as in the Salvation Army example) and some include real links, to cheaply add authenticity.

The danger with the rules above is that they are stated as absolutes, so a message that satisfies them earns unwarranted confidence.

A better guide

Take precautions. (See Scamwatch)

  • Do not click links or attachments (or add apps or extensions) that you did not expect.
  • Google the wording of a message to find out whether others have reported it.
  • Expect a secure (https) site – a closed padlock on the address bar.
  • Never provide personal details to a caller. Call the organisation through its official channels yourself.

Diversity of Fish

Phish with no smell

We’re catching a better class of Phish.

I’ll be honest: this one was detected by a machine, not a human. (It was flagged as possible spam by a mail-server, but still delivered.) This example was –

  • Plausible because it very accurately mimics the email and website of a legitimate, trustworthy organisation.
  • Confirmed as fraudulent because the Reply address, buttons and links point to salvationarmyeast.com, which is not the usual domain of the real Salvation Army and does not host a website. (Right-click a button or link and read the domain after the @ or after https://; a small sketch of this check appears below.)
  • Risky because attempting to donate may lead you to disclose credentials, identity and/or financial details which could be cross-referenced with data harvested in other scams.

Sample phishing message
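
The right-click check described above can be scripted. Here is a minimal Python sketch of the idea (the trusted domain and function names are illustrative assumptions, not a description of any real mail filter): extract the host from a link, then accept it only if it is, or is a subdomain of, a domain you already trust.

    from urllib.parse import urlparse

    # Illustrative whitelist only; a real organisation would maintain its own.
    TRUSTED_DOMAINS = {"salvationarmy.org.au"}

    def link_domain(url: str) -> str:
        """Return the host part of a URL, e.g. 'donate.salvationarmyeast.com'."""
        return urlparse(url).netloc.lower().split(":")[0]

    def looks_trusted(url: str) -> bool:
        host = link_domain(url)
        # Accept an exact trusted domain or one of its subdomains;
        # anything else, including plausible look-alikes, fails the check.
        return any(host == d or host.endswith("." + d) for d in TRUSTED_DOMAINS)

    print(looks_trusted("https://donate.salvationarmyeast.com/give"))  # False
    print(looks_trusted("https://www.salvationarmy.org.au/donate"))    # True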

Unreasonable lengths

What would it take to avoid this trap?

  1. Be skeptical of the first message – or any unusual message – from anyone.
  2. Get contact details from a more trustworthy source: a mutual contact is good, but Google is better than nothing.
  3. Whitelist the trusted contact details, and block unverified senders that look similar (a rough sketch follows).
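
As a rough sketch of rule 3 (the whitelist, threshold and function name are illustrative assumptions, not a production mail filter), a filter could pass whitelisted senders, flag near misses as look-alikes, and leave everything else unverified:

    from difflib import SequenceMatcher

    TRUSTED_DOMAINS = {"salvationarmy.org.au"}  # assumed whitelist
    SIMILARITY_THRESHOLD = 0.6                  # illustrative; tune before real use

    def classify_sender(address: str) -> str:
        """Classify an email address as trusted, a suspected look-alike, or unverified."""
        domain = address.rsplit("@", 1)[-1].lower()
        if domain in TRUSTED_DOMAINS:
            return "trusted"
        # A domain that closely resembles a trusted one suggests deliberate imitation.
        if any(SequenceMatcher(None, domain, d).ratio() >= SIMILARITY_THRESHOLD
               for d in TRUSTED_DOMAINS):
            return "suspected look-alike"
        return "unverified"

    for sender in ("donations@salvationarmy.org.au",
                   "donations@salvationarmyeast.com",
                   "newsletter@some-vendor.com"):
        print(sender, "->", classify_sender(sender))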

But this mindset is not discriminating enough for school. While a student probably could use these rules, they are impractical for anyone who needs new customers – most small businesses and freelancers. To equip students for entry to an increasingly casualised workforce, we need to better understand and convey how trust can be established between strangers.

Man holding fish

Why the Principal is not a Superuser

It’s obvious to the data team, but can be perplexing to school executives, that the Head does not need an “administrator” role in the School Information System.

While we do try to make their work as fluid as possible, executives need to understand:

  • The data is very incomplete. When you look at it you are missing significant meaning that comes from our detailed practices and undocumented understandings that are the responsibility of dozens of other workers in the school. You need their eyes and experience, not just your own, to make sense of it.
  • The application is very noisy. It holds millions of records – more than any person can grasp. Even gaps are significant, but not obvious. You need everything irrelevant stripped away.
  • The data is fragile. Minor changes or corrections can have unintended effects on the real-world treatment of students and staff. As far as possible, we delegate that responsibility to staff who understand the significance of corrections; we wrap changes in procedures to ensure consistency between systems; and we expect the editing staff to devote time to learning the minute detail of how the school handles data changes.
  • The data is sensitive. Real people need effort to switch mindset as they shift from pattern-scanning large lists to individual pastoral consideration of children or staff with confidential financial, medical, legal or pastoral notes. A degree of inconvenience in the application helps to prompt appropriate caution.

This is not a flaw: it is inherent in the complexity of an organisation with hundreds of clients and accountability for millions of items of data.

The key survival strategy for executives is to deeply understand “dashboard” reports that focus on their strategic issues, and to make use of staff who have deeper awareness of the implications of the data.

Image credit: Wikimedia

Zoom outriders

A talented teacher asked,

I have a student who will still require ongoing Zoom to have lessons whenever I give face to face lesson to the rest of the class with OneNote [on a big touchscreen in the classroom]. I need to know how can I give lesson to one via Zoom and the rest face to face.

It is technically possible to plug a USB microphone into the classroom PC to work in Zoom sessions.

However, it takes more than electronics. It’s easy to underestimate the expertise involved in classroom teaching. In your physical classroom you are monitoring understanding through your highly developed peripheral awareness and familiarity with classroom behavioural indicators. Evaluating learning over Zoom is hard due to restricted vision, sound, contextual awareness and the challenge of attention-switching.  To compensate, you would need to give each site equal attention, whether it be the home with one student or the classroom with 25.

We normally work hard to keep all outside distractions out of the classroom. Your in-person class may feel short-changed. And for the remote student, we cannot expect the Zoom session to be as interactive or reassuring or clear as being in your classroom.

What learning benefit do you want to provide?

How can an absent student become confident that they understand the content, or that they need to ask more about it? Students in the classroom look at peers’ work and body language. Zooming your talk to the class won’t provide that. Instead you might offer, for example:

  • pair-work with another student over Zoom (using its whiteboard feature)
  • recorded solution demonstrations with space for question/comment (in OneNote or ClickView interactives)
  • a class space for practice-question discussion and solutions (in a Q&A forum or an open Zoom meeting)
  • an audio recording of your in-class explanation of OneNote material, captured using OneNote audio
  • individual followup and support.

The tough teaching challenge is working out what you can do efficiently with the time you’ve got.

Cheating and Entrapment

Lecturer statement reported by students: 

I worked with TA’s to create this uniquely worded question and submit the answer on Chegg – an online homework solving website… If you answered this question “correctly” on the final, you will receive a 0 for the exam, and will be reported to the university for violating the academic integrity policy.

What do students learn during your exam?

Students do learn during exams. If an exam contains deception or untruths, it potentially undermines the confidence or expertise of our graduates. If the exam reveals to them that our staff are hostile or unreliable, it undermines our value as an authoritative source for our graduates and their employers. Perverse strategies like this can only be defended if there is no other way to achieve our assessment aims.

There is a more honest and constructive path.

For example: give advance notice, ahead of the exam period, that one of the questions is “impossible” and will be scored for insight rather than results. Or, if it is a remotely proctored exam, include a question that calls for a “talk aloud” (real-time stream of consciousness) response. Or include questions that score students on evaluating alternative solutions in dialogue with another (assigned) student.

Cheating is a breach of trust. So is entrapment.

Why did you think the trust between you and students was stronger than their need to get a grade? Did students see the same investment in relationship-building that you did, or only their need for grades?

Let’s be clear. Publishing deliberate errors is not exemplary academic integrity.

Your assessment plan is broken.

Voiding 29% of your exam assessment indicates that you need a different assessment model. You can reduce the reward for misbehaviour by using incremental assessment. We have a large kit of alternatives to high-stakes closed-form exams.

Is cheating a professional competence?

If your discipline or industry regards high-stakes, closed-form exams as being the most authentic assessment, it has married tacit tolerance of this kind of cheating to strenuous denials. As an institution educating future professionals, we (ethically) owe students an honest appraisal of that context, and assistance in developing their optimal strategies for advancement. We need to ask how we can train students to excel in a cheating-prone discipline.

Edutech implementation as resituation

Changing your own practice is real work.

 

Westberry et al. (2015) unpack the reflections of 14 lecturers who switched to multi-site lecturing.

Experiences and responses

Some good teaching intentions were frustrated:

I conceived it as sort of having windows where you could, could talk or see or respond to people in other rooms, and that we would be a community…

I always use teachable moments … all this technology absolutely puts a down on that whole teaching system, because you have to abide by some rules, that you have to go on with the flow and be on camera and be on shot…

Sometimes the solution is to be less ambitious.

While the technical staff believed that more interactive versions of the technology were needed, teachers on all four courses adapted by limiting interactivity or resituating it in other contexts such as tutorials.

Solutions often involved increasingly collaborative teaching practice.

Almost all teachers in the remote venues, however, experienced uncertainty and a lack of control…

Time – perhaps a year – is required.

…the lack of professional development opportunities for staff and daily technical breakdowns led to replacement with another system after one year of operation.

They describe the teachers’ adaptation process as “often turbulent and multifaceted”, and,

in a situation of perceived compromise, beliefs remained predominantly unchanged.

things are a bit out of control – that [the lecture] it is not a very safe place, that the ground is shifting and it can all just disappear at any minute. (Technician)

Implications

Teachers need to be able to openly voice and negotiate pedagogical and technological choices.

In conclusion, the authors argued

it is imperative for technological changes to be introduced either in a way that is aligned with teachers’ current knowledge and ways of working, or with the support and time needed to effectively resituate them.

References

Nicola Westberry, Susan McNaughton, Jennie Billot & Helen Gaeta (2015) Resituation or resistance? Higher education teachers’ adaptation to technological change. Technology, Pedagogy and Education, 24:1, 101-116. DOI: 10.1080/1475939X.2013.869509

Michael Eraut (2004) Informal learning in the workplace. Studies in Continuing Education, 26:2. DOI: 10.1080/158037042000225245

Does study cause procrastination?

20% of normal adults, 50% of college students, are chronic procrastinators.

Why is procrastination normal at university, more than in the general population? Tim Pychyl says procrastination is about feeling good, now (Pychyl, 2012).

I know what I should do, but I just want to have some me-time first. (A failure of self-regulation.)

But friends and peers should help. The example of others around me helps me feel ready to make a better choice. (Social regulation.)

So when we see a crowd of procrastinators together, I wonder,

  • Does this crowd attract procrastinators?
  • Or does this crowd make procrastinators?

 

References

Ellis, A. & Knaus, W.J.  1977, Overcoming procrastination, Signet Books, New York.

Harriott, J. & Ferrari, J.R. 1996, “Prevalence of procrastination among samples of adults”, Psychological Reports, 78:611–616.

Michinov, N., Brunot, S., Le Bohec, O., Juhel, J. & Delaval, M. 2011, “Procrastination, participation, and performance in online learning environments”, Computers & Education, vol. 56, no. 1, pp. 243-252.

Pychyl, T. (2012) Teaching Talk: Helping Students Who Procrastinate. EDC Video Channel, Youtube.

Scholarship v Plagiarism

Plagiarism spectrum & Did I Plagiarize? are beautiful, concise expressions of poor practice, but I wish we could focus instead on a spectrum of valuable scholarship.

Academic writing is a continuation of a worldwide, centuries-long conversation, and if my writing doesn’t connect to the conversation, or doesn’t advance it, I don’t count. 

  1. Is it easy to verify the source of my ideas, words, images and data? (Unverified = untrustworthy.)
  2. Have I contributed any original understanding or expression? (No insight = insignificant.)

Author typing on laptop surrounded by newspapers

Credit: dave, 2004

Declarations of interest

Is it coincidence that the ideas for both charts are ultimately sourced from iParadigms, which profits from beliefs about plagiarism, while protesting that Turnitin is not a plagiarism detector?

Turnitin is important in my day job. I encourage academics to provide Turnitin in their courses, to allow students to check for unoriginal text before submitting their writing; and I train staff to use GradeMark to provide auditable, inline feedback on assignments to students.

Like most educationalists, I recognise emulation as a stage in a learning process.

References

iParadigms (2013) Does Turnitin detect plagiarism? Turnitin. Online at http://turnitin.com/en_us/resources/blog/421-general/1643-does-turnitin-detect-plagiarism retrieved 26/09/2014

iParadigms (2012) The Plagiarism Spectrum. Turnitin. Online at http://www.turnitin.com/assets/en_us/media/plagiarism_spectrum.php retrieved 26/09/2014

Newbold C (2014) Did I Plagiarize? TheVisualCommunicationGuy. Online at http://visual.ly/did-i-plagiarize-types-and-severity-plagiarism-violations retrieved 26/09/2014.

Schaum B, Stohrer M (2014) Plagiarism, Emulation, and Originality. Slideshare. Online at http://www.slideshare.net/beths0103/plagiarism-emulation-and-originality retrieved 26/09/2014.

dave (2004) pro_author. Morguefile. Online at http://mrg.bz/nH5vv9 retrieved 26/09/2014

Complex activities in Moodle

Learning design

 

Can you build an extended, interactive lesson in Moodle?

Yes!

I’ve seen (and set up) some quite complex sequences. The activity mapped out here engaged students over several weeks; involved several team meetings and individual, small-group, large-group and plenary work; and was highly active and personalised.

After a team selection and formation activity, individual students submit a short video (explaining a revision topic) for peer assessment, and review several others for correctness and effective communication (inter alia). Each team chooses and submits its best video to a shoot-out in a tutorial for team points; and the best of each tutorial goes into a shoot-out in a lecture for team points. The video collection provides revision resources for the next cohort.

However…

Complex, extended learning designs are hard to maintain, and hard to migrate to a different platform in the future. How portable is the learning design (to the next version of device, browser, or LMS)? How soon will it appear maladapted to incoming learners? How many people are competent to revise it? How costly is it to modify and test as the curriculum or content changes?

And perhaps more importantly, can it flex for students with unique needs or circumstances?

Acknowledgements

The course design illustrated was originally developed and piloted by Waldron, Kinkaid and Whitty for UNSW Australia.

Illustration prepared with CompendiumLD, the Open University (UK)’s special-purpose adaptation of the open-source software Compendium, available from http://compendiumld.open.ac.uk/

Active Learning beats Lectures

Finding

Active Learning approaches are more effective than lectures at delivering information to undergraduate science, engineering and mathematics students.

Brunel University student study group.

This finding may seem unsurprising, because it comes from a review of many published studies that may already have drawn our attention, but it is compelling. (The mean achievement of Active Learning groups was 0.47 standard deviations above the mean of Lecture groups. The failure rate was roughly halved, and the attrition rate was reduced.)
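
For a sense of scale, here is a minimal sketch assuming roughly normal score distributions (an assumption of this note, not a claim from the review): a 0.47 standard-deviation shift moves an average student from the 50th to roughly the 68th percentile of the lecture-only distribution.

    from statistics import NormalDist

    # Percentile of the lecture-only distribution reached by the average
    # Active Learning student, assuming normally distributed scores.
    effect_size = 0.47
    print(round(NormalDist().cdf(effect_size) * 100))  # about 68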

The authors argue that it is now unethical to use lecturing instead of Active Learning approaches (for knowledge delivery), knowing that the effect is so great.

Definitions:

In their review, the term Active Learning means a constructivist teaching/learning approach: providing experiences that challenge students to build their own understanding and competence, and that detect and correct misunderstandings, rather than just delivering information. Discussion in tutorials can do this.

Reservations

  • One reasonable objection is that the instructors were volunteers, who might be more motivated than average lecturers. Reluctant instructors might be unable to deliver the expected benefits of an Active Learning design.
  • The measured benefit may simply indicate improved alignment between the learning method and the assessment method (examination), rather than better learning in itself.
  • The only outcome measured was performance on the unit assessment. It did not look at long-term impact on learning expectations, behaviours and competence. If, for example, reflecting on lectures were actually the professional competence that graduates need, this finding would be irrelevant – but then the assessment method should be changed to align better with that goal.

Reference

Scott Freeman, Sarah L. Eddy, Miles McDonough, Michelle K. Smith, Nnadozie Okoroafor, Hannah Jordt, and Mary Pat Wenderoth (2014) Active learning increases student performance in science, engineering, and mathematics. PNAS, published online ahead of print. Online at http://www.pnas.org/content/early/2014/05/08/1319030111 accessed 16/06/2014.