Please use this identifier to cite or link to this item: http://hdl.handle.net/11366/681
DC Field: Value (Language)
dc.contributor.author: Martens, Birgitte (en_US)
dc.contributor.author: Van den Berghe, Steven (en_US)
dc.date.accessioned: 2018-05-15T15:46:44Z
dc.date.available: 2018-05-15T15:46:44Z
dc.date.issued: 2018-06-15
dc.identifier.uri: http://hdl.handle.net/11366/681
dc.description: Extended abstract accepted at the CRIS2018 Conference. See event programme at http://www.cris2018.se/schedule/ (en_US)
dc.description.abstract: Research evaluation practices, and the measurement of research quality and impact they imply, are being examined from very different angles. For research information managers, transparency and access probably constitute the most urgent issues at present, while R&D directors and research policymakers devote their attention to the ethical consequences and practical implications of open access and open data (NordForsk, 2016). The latter group has another pressing preoccupation: the validity and robustness of current research evaluation protocols. (en_US)
dc.description.abstract: Existing research evaluation models all start from data registered in institutional and/or research policy CRIS. The most frequently voiced concern in research management circles relates to the identification and implementation of systematically collected, well-documented, objective metrics for impact measurement, especially societal impact measurement (Bornmann, 2013). A second worry in the same circles has to do with the steady rise, both in volume and in policy appreciation, of practice-based research. In sharp contrast to the standard practice of communicating research results in the form of publications, the outcomes of applied research are often not communicated in writing but take on quite different dissemination forms. The registration systems of national evaluation frameworks, research funding agencies and the like are not always open to or considerate of non-written output. Within these databases, non-written outcomes lack a systematic, informative organization, and therefore visibility, and as a result recognition in terms of quality and impact valuation (and the funding attached to it).
dc.language.iso: en (en_US)
dc.publisher: euroCRIS (en_US)
dc.relation.ispartofseries: CRIS2018: 14th International Conference on Current Research Information Systems (Umeå, June 13-16, 2018)
dc.subject: research information management (en_US)
dc.subject: research evaluation (en_US)
dc.subject: non-written outputs (en_US)
dc.title: The registration of art and design research outcomes: Visualizing the invisible (Case study: The Flemish FRIS-registration format) (en_US)
dc.type: Presentation (en_US)
dc.relation.conference: CRIS2018 – Umeå (en_US)
item.openairecristype: http://purl.org/coar/resource_type/c_18cf
item.grantfulltext: open
item.cerifentitytype: Publications
item.openairetype: Presentation
item.fulltext: With Fulltext
item.languageiso639-1: en
Appears in Collections: Conference
Files in This Item:
File: Martens_VanDenBerghe_CRIS2018_paper_Art_and_design_research_outcomes.pdf
Description: Extended abstract (PDF)
Size: 296.02 kB
Format: Adobe PDF

Items in DSpace are offered under a CC-BY 4.0 licence unless otherwise indicated