We present a crowd-based evaluation of the impact of domain expertise on query formulation, relevance assessment, and retrieval performance. Queries issued by experts are significantly longer and more technical than queries issued by novices. Relevance assessment agreement is low among both experts and novices, but the reasons for assessment difficulty differ significantly between the two groups. Traditional information retrieval models, which mainly consider the presence or absence of query terms within documents, are particularly unsuccessful for experts, who instead draw on their knowledge and past experience to assess relevance along multiple dimensions.
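To make the term-matching behavior criticized above concrete, the sketch below scores documents by the fraction of query terms they contain, a coordination-level match. The scoring function, toy documents, and the two queries are all hypothetical illustrations, not data or methods from the study; they only show why purely lexical overlap favors queries whose vocabulary happens to match the documents.

```python
# A minimal sketch of term-presence scoring, the kind of lexical matching
# the abstract argues falls short for expert searchers. All names and the
# toy data below are illustrative assumptions, not taken from the study.

def term_overlap_score(query: str, document: str) -> float:
    """Score a document by the fraction of query terms it contains."""
    query_terms = set(query.lower().split())
    doc_terms = set(document.lower().split())
    if not query_terms:
        return 0.0
    return len(query_terms & doc_terms) / len(query_terms)

docs = [
    "myocardial infarction management guidelines",
    "heart attack symptoms and first aid",
]

# A longer, more technical expert-style query vs. a lay novice-style query.
expert_query = "acute myocardial infarction reperfusion guidelines"
novice_query = "heart attack treatment"

for query in (expert_query, novice_query):
    ranked = sorted(docs, key=lambda d: term_overlap_score(query, d), reverse=True)
    print(f"{query!r} -> top document: {ranked[0]!r}")
```

Under such a model, the expert query retrieves only the document sharing its technical vocabulary and cannot credit the lay document describing the same condition, which hints at why term-presence models struggle to serve assessors who judge relevance on knowledge-based dimensions beyond surface wording.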