Innovation and transformation, as well as the emergence of disruptive business ecosystems supported by various software, have gained increasing significance. One approach to tackling this complex task is Design Thinking, which applies designers' problem-solving techniques (ideation, prototyping, and testing) to innovation processes through agile collaboration among stakeholders.
The goal is to generate ideas by using different Design Thinking methods, based on tangible visualization of certain aspects of the problem within a developed solution space, where collaboration among stakeholders plays a central role. Design Thinking enables early exploration and validation of designs for new services, smart products, and disruptive business models, but it is constrained by the location and temporal availability of stakeholders. Absent stakeholders must be informed afterward, which is often not directly supported by the Design Thinking methods applied.
This hands-on tutorial demonstrates, through the Scene2Model software tool, the transformation of physical visualizations into digital conceptual models, so that they can be processed and used within modelling tools, further decomposed, and combined with available enterprise assets. Scene2Model enables a software-supported Design Thinking environment that is location- and time-independent, thus facilitating the collaboration of globally distributed networks and stakeholders, as demanded by the digital transformation and globalization of businesses.
The Scene2Model software tool supports early prototype development in any domain, as well as further processing of the outcomes into the desired specifications. The interplay of Conceptual Modelling and Design Thinking establishes a connection between unrestrained design artefacts and more formal abstractions (e.g., business process models).
The tutorial will introduce participants to storyboards as a selected Design Thinking method. Haptic paper figures (SAP Scenes™) are used to depict scenes and build storyboards in the context of the Smart Mobility domain. Participants will observe the end-to-end process of a tool-supported transformation from haptic paper objects into digitized models. The Scene2Model tool not only enables this transformation but also semantically enriches the models and allows their automated composition into storyboards.
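To illustrate the kind of transformation described above, the following is a minimal sketch in Python. It assumes each paper figure carries a recognizable identifier that is mapped to a semantic concept from a catalogue, and that the resulting typed scenes are composed into an ordered storyboard. All class names, the catalogue, and the marker-based detection are hypothetical illustrations, not Scene2Model's actual implementation or API.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of haptic-to-digital transformation:
# names, types, and the marker catalogue are illustrative assumptions.

@dataclass
class SceneObject:
    marker_id: int   # identifier assumed attached to the paper figure
    concept: str     # semantic type resolved from the catalogue

@dataclass
class Scene:
    title: str
    objects: list = field(default_factory=list)

@dataclass
class Storyboard:
    scenes: list = field(default_factory=list)

    def compose(self, scene: Scene) -> "Storyboard":
        # Scenes are appended in order, forming the storyboard sequence.
        self.scenes.append(scene)
        return self

# Assumed mapping from detected identifiers to semantic concepts.
CATALOGUE = {101: "Person", 202: "Vehicle", 303: "ChargingStation"}

def digitize(title: str, detected_markers: list) -> Scene:
    """Turn detected identifiers into a semantically enriched scene."""
    scene = Scene(title)
    for mid in detected_markers:
        scene.objects.append(SceneObject(mid, CATALOGUE.get(mid, "Unknown")))
    return scene

# Example: two Smart Mobility scenes composed into one storyboard.
board = Storyboard()
board.compose(digitize("Commuter requests ride", [101, 202]))
board.compose(digitize("Vehicle recharges", [202, 303]))
```

In this sketch, semantic enrichment is reduced to a dictionary lookup; the point is only to show the pipeline shape (detect, type, compose) that the tutorial demonstrates with the actual tool.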
Scene2Model can be obtained at https://www.omilab.org/activities/projects/details/?id=131