The objective of this article is to present a qualitative analysis tool which technology transfer offices (TTOs) can use to improve their efficiency and effectiveness. This qualitative tool is one of the novelties presented. The other is information that advances understanding of the processes, procedures and structures required to transfer technology, expressed as a set of best practices. From December 2008 to September 2010, a variety of methodologies (document analysis, participative observation, interviews and surveys) generated data which led to the development of a theoretical framework. The theoretical framework, called the Master Plan for Technology Transfer (TT), is a reference schema for best practices. The Master Plan contains 271 rules (good practices) referring to 43 facilitators distributed across seven groups. The facilitators and rules were selected through a coding process based on grounded theory, where facilitators are the categories and rules are their properties. Based on these methodologies and the development of the Master Plan, we constructed a tool called Best Transfer Practices (BTP), a qualitative tool to assess and study TTOs and their host R&D institutions. The collection of rules and facilitators is the soul of our BTP and is our contribution to the knowledge of actual practices in TT. ► We made a qualitative analysis tool to assess technology transfer offices (TTOs). ► We developed a theoretical framework called master plan for this. ► The master plan is a reference schema for best practices in technology transfer. ► The master plan contains 271 rules (good practices) referring to 43 facilitators. ► The facilitators/rules were selected from a coding process based on grounded theory.
Sustainable supply chain management is a topical area which continues to grow and evolve. Within supply chains, downstream distribution from producers to customers plays a significant role in the environmental performance of production supply chains. With consumer consciousness growing in the area of sustainable food supply, food distribution needs to adapt to improve its environmental performance while still remaining economically competitive. With a particular focus on the dairy industry, a robust solution approach is presented for the design of a capacitated distribution network for a two-layer supply chain involved in the distribution of milk in Ireland. In particular, the green multi-objective optimisation model minimises CO2 emissions from transportation and total costs in the distribution chain. These distribution channels are analysed to ensure the non-dominated solutions are distributed along the Pareto fronts. A multi-attribute decision-making approach, TOPSIS, has been used to rank the realistic feasible transportation routes resulting from the trade-offs between total costs and CO2 emissions. The refined realistic solution space allows the decision-makers to geographically locate the sustainable transportation routes. In addition to geographical mapping, the decision-maker is also presented with a number of alternative analysed scenarios which forcibly open closed distribution routes to build resiliency into the solution approach. In terms of model performance, three separate GA-based optimisers have been evaluated and reported upon. In the case presented, NSGA-II was found to outperform its counterparts, MOGA-II and HYBRID.
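A minimal sketch of the TOPSIS ranking step described above, assuming only the two criteria named in the abstract (total cost and CO2 emissions). The route names, criterion values and equal weights are illustrative placeholders, not values from the study.

```python
import numpy as np

# Hypothetical decision matrix: each row is a candidate distribution route,
# columns are the two criteria traded off in the model (total cost, CO2 emissions).
# Both criteria are "costs", i.e. lower is better. Values are illustrative only.
routes = ["route_A", "route_B", "route_C", "route_D"]
X = np.array([
    [120_000.0, 85.0],   # total cost (euro), CO2 (tonnes)
    [ 95_000.0, 110.0],
    [140_000.0, 60.0],
    [105_000.0, 95.0],
])
weights = np.array([0.5, 0.5])          # assumed equal importance
benefit = np.array([False, False])      # both criteria are to be minimised

# 1. Vector-normalise and weight the decision matrix.
R = X / np.linalg.norm(X, axis=0)
V = R * weights

# 2. Ideal and anti-ideal points (direction depends on criterion type).
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti = np.where(benefit, V.min(axis=0), V.max(axis=0))

# 3. Euclidean distances to both reference points.
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)

# 4. Relative closeness to the ideal solution (higher = better rank).
closeness = d_neg / (d_pos + d_neg)
for name, c in sorted(zip(routes, closeness), key=lambda t: -t[1]):
    print(f"{name}: closeness = {c:.3f}")
```

In the approach described, the rows of such a matrix would be the non-dominated routes taken from the Pareto front, so TOPSIS acts as a tie-breaking ranking over solutions the optimiser has already declared efficient.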
This paper provides a review of stochastic Data Envelopment Analysis (DEA). We discuss extensions of deterministic DEA in three directions: (i) deviations from the deterministic frontier are modeled as stochastic variables, (ii) random noise in terms of measurement errors, sample noise, and specification errors is made an integral part of the model, and (iii) the frontier is stochastic, as is the underlying Production Possibility Set (PPS). Stochastic DEA utilizes non-parametric convex or conical hull reference technologies based upon axioms from production theory, accompanied by a statistical foundation in terms of axioms from statistics or distributional assumptions. The approaches allow for an estimation of stochastic inefficiency compared to a deterministic or a stochastic PPS and for statistical inference while maintaining an axiomatic foundation. The focus is on bridges and differences between approaches within the field of stochastic DEA, including semi-parametric Stochastic Frontier Analysis (SFA) and Chance Constrained DEA (CCDEA). We argue that statistical inference based upon homogeneous bootstrapping, in contrast to a management science approach, imposes a restrictive structure on inefficiency, which may not facilitate the communication of the results of the analysis to decision makers. Semi-parametric SFA and CCDEA differ with respect to the modeling of noise and stochastic inefficiency. Despite their inherent differences, the two approaches are shown to be complements in the sense that the stochastic PPSs obtained by the two approaches share basic similarities in the case of one output and multiple inputs. Recent contributions related to (i) disentangling random noise and random inefficiency and (ii) obtaining smooth shape-constrained estimators of the frontier are discussed.
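To fix ideas, a generic chance-constrained DEA model may be sketched as follows. This is one standard input-oriented formulation with deterministic inputs and stochastic outputs; the notation (theta, lambda, alpha) follows common DEA usage and is not specific to any particular contribution reviewed here.

```latex
% Input-oriented chance-constrained DEA for evaluating unit 0,
% with deterministic inputs x_{ij} and stochastic outputs \tilde{y}_{rj}.
\begin{aligned}
\min_{\theta,\,\lambda}\ & \theta \\
\text{s.t.}\ & \Pr\!\Big(\sum_{j=1}^{n} \lambda_j \tilde{y}_{rj} \ \ge\ \tilde{y}_{r0}\Big) \ \ge\ 1-\alpha,
  && r = 1,\dots,s,\\
& \sum_{j=1}^{n} \lambda_j x_{ij} \ \le\ \theta\, x_{i0},
  && i = 1,\dots,m,\\
& \lambda_j \ \ge\ 0, && j = 1,\dots,n.
\end{aligned}
```

Under joint normality of the outputs, each chance constraint has a deterministic equivalent of the form mu_r(lambda) >= Phi^{-1}(1 - alpha) * sigma_r(lambda), where mu_r(lambda) and sigma_r(lambda) are the mean and standard deviation of the slack term sum_j lambda_j y_rj - y_r0, which is what makes CCDEA computationally tractable.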
This paper provides a sketch of some of the major research thrusts in data envelopment analysis (DEA) over the three decades since the appearance of the seminal work of [Charnes, A., Cooper, W.W., Rhodes, E.L., 1978. Measuring the efficiency of decision making units. European Journal of Operational Research 2, 429–444]. The focus herein is primarily on methodological developments, and in no manner does the paper address the many excellent applications that have appeared during that period. Specifically, attention is primarily paid to (1) the various models for measuring efficiency, (2) approaches to incorporating restrictions on multipliers, (3) considerations regarding the status of variables, and (4) modeling of data variation.
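For reference, the input-oriented envelopment (dual) form usually associated with the cited 1978 CCR paper can be stated compactly; the notation below is the conventional one for n decision making units with m inputs and s outputs.

```latex
% Input-oriented CCR envelopment model for DMU_0.
\begin{aligned}
\min_{\theta,\,\lambda}\ & \theta \\
\text{s.t.}\ & \sum_{j=1}^{n} \lambda_j x_{ij} \ \le\ \theta\, x_{i0}, && i = 1,\dots,m,\\
& \sum_{j=1}^{n} \lambda_j y_{rj} \ \ge\ y_{r0}, && r = 1,\dots,s,\\
& \lambda_j \ \ge\ 0, && j = 1,\dots,n.
\end{aligned}
```

Adding the convexity constraint sum_j lambda_j = 1 yields the variable-returns-to-scale (BCC) model, one example of the "various models for measuring efficiency" that such methodological surveys cover.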
Purpose - Indirect or mediated effects constitute a type of relationship between constructs that often occurs in partial least squares (PLS) path modeling. Over the past few years, the methods for testing mediation have become more sophisticated. However, many researchers continue to use outdated methods to test mediating effects in PLS, which can lead to erroneous results. One reason for the use of outdated methods or even the lack of their use altogether is that no systematic tutorials on PLS exist that draw on the newest statistical findings. The paper aims to discuss these issues. Design/methodology/approach - This study illustrates the state-of-the-art use of mediation analysis in the context of PLS-structural equation modeling (SEM). Findings - This study facilitates the adoption of modern procedures in PLS-SEM by challenging the conventional approach to mediation analysis and providing more accurate alternatives. In addition, the authors propose a decision tree and classification of mediation effects. Originality/value - The recommended approach offers a wide range of testing options (e.g. multiple mediators) that go beyond simple mediation analysis alternatives, helping researchers discuss their studies in a more accurate way.
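As a concrete illustration of the bootstrap-based testing that modern mediation guidance recommends, the sketch below estimates a simple indirect effect a*b and its percentile bootstrap confidence interval. Plain OLS on simulated scores stands in for the PLS path estimates, and all variable names, coefficients and sample sizes are hypothetical; in PLS-SEM the same resampling logic is applied to the estimated path coefficients.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative data: exogenous X, mediator M, outcome Y (standardised scores).
# In a real PLS-SEM study these would be latent variable scores from the model.
n = 300
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(scale=0.8, size=n)
y = 0.4 * m + 0.2 * x + rng.normal(scale=0.8, size=n)

def indirect_effect(x, m, y):
    """a*b estimate from two OLS regressions: M ~ X and Y ~ M + X."""
    a = np.polyfit(x, m, 1)[0]
    X2 = np.column_stack([np.ones_like(x), m, x])
    b = np.linalg.lstsq(X2, y, rcond=None)[0][1]
    return a * b

# Percentile bootstrap of the indirect effect (preferred over the Sobel
# z-test, which assumes normality of the a*b product).
boot = np.empty(5000)
for i in range(boot.size):
    idx = rng.integers(0, n, n)
    boot[i] = indirect_effect(x[idx], m[idx], y[idx])

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {indirect_effect(x, m, y):.3f}, 95% CI = [{lo:.3f}, {hi:.3f}]")
```

If the bootstrap interval excludes zero, the indirect effect is taken as significant; combined with the significance and sign of the direct effect, this supports the kind of classification of mediation effects (e.g. complementary versus competitive partial mediation) that the decision tree mentioned above is meant to guide.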
Understanding emotions is an important aspect of personal development and growth, and as such it is a key tile for the emulation of human intelligence. Besides being important for the advancement of AI, emotion processing is also important for the closely related task of polarity detection. The opportunity to automatically capture the general public's sentiments about social events, political movements, marketing campaigns, and product preferences has raised interest in both the scientific community, for the exciting open challenges, and the business world, for the remarkable fallouts in marketing and financial market prediction. This has led to the emerging fields of affective computing and sentiment analysis, which leverage human-computer interaction, information retrieval, and multimodal signal processing for distilling people's sentiments from the ever-growing amount of online social data.
Consumer-generated content has provided an important new information medium for tourists throughout the purchasing lifecycle, transforming the way that visitors evaluate, select and share experiences about tourism. Research in this area has largely focused on quantitative ratings provided on websites. However, advanced techniques for linguistic analysis provide the opportunity to extract meaning from the valuable comments provided by visitors. In this paper, we identify the key dimensions of customer service voiced by hotel visitors using a data mining approach, latent Dirichlet allocation (LDA). The big data set includes 266,544 online reviews for 25,670 hotels located in 16 countries. LDA uncovers 19 controllable dimensions that are key for hotels to manage their interactions with visitors. We also find differences according to demographic segments. Perceptual mapping further identifies the most important dimensions according to the star-rating of hotels. We conclude with the implications of our study for future research and practice.
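A minimal sketch of the topic-modelling step, assuming scikit-learn's LatentDirichletAllocation as one possible implementation; the tiny corpus and the choice of three topics are purely illustrative (the study fits 19 dimensions on 266,544 reviews).

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Tiny illustrative corpus; the study used 266,544 hotel reviews.
reviews = [
    "friendly staff at check in and very clean room",
    "breakfast buffet was excellent, great location near the station",
    "room was noisy and the wifi kept dropping",
    "helpful front desk, quick check in, comfortable bed",
    "poor value for money, parking was expensive",
]

# Bag-of-words representation (stop words removed).
vectorizer = CountVectorizer(stop_words="english", max_features=1000)
dtm = vectorizer.fit_transform(reviews)

# Fit LDA; three topics for the toy corpus, where the study identified 19 dimensions.
lda = LatentDirichletAllocation(n_components=3, random_state=0)
lda.fit(dtm)

# Print the top words per topic, the usual way topics are labelled as service dimensions.
terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"topic {k}: {', '.join(top)}")
```

Labelling each inferred topic from its top-weighted words is what turns the unsupervised output into interpretable, managerially controllable dimensions of customer service.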
This article explores the domain of international entrepreneurship (IE) research by thematically mapping and assessing the intellectual territory of the field. Extant reviews show that the body of IE knowledge is growing, and while notable contributions towards theoretical and methodological integration are evident, the field is described as phenomenally based, potentially fragmented and suffering from theoretical paucity. Premising that IE is positioned at the nexus of internationalization and entrepreneurship where entrepreneurial behavior involves cross-border business activity, or is compared across countries, we identify 323 relevant journal articles published in the period 1989–2009. We inventory the domain of IE to provide a relevant and comprehensive organization of its research. This involves examining the subject matter of IE research, and inductively synthesizing and categorizing it into major themes and sub-themes. In so doing, we offer a reliable, ontologically constructed and practically useful resource. From this base, we discuss the phenomena, issues, inconsistencies and interim debates on which new theory in IE may be built and research may be conducted. We conclude that IE has several coherent thematic areas and is rich in potential for future research and theory development. ► We review, thematically map and assess the intellectual territory of IE research. ► We construct a comprehensive ontological inventory of IE research (1989–2009). ► We provide a fully documented methodology for future replication. ► Criticism of IE as fragmented and lacking in unifying paradigms is premature. ► We conclude that IE is diverse and growing in coherence and is rich in theoretical potential.
This paper analyses innovation paths and the innovation performance of low-technology firms in comparison to medium- and high-technology firms. Firstly, it shows that low-, medium- and high-technology sectors consist of a considerable mix of low-, medium- and high-technology firms. Thus, it is necessary to look at the firm level when analysing how innovation patterns differ depending on the level of R&D intensity. Secondly, the product and process innovation performance of low-technology firms in German industry is analysed based on data from 1663 firms in the German Manufacturing Survey 2006, applying a set of both product and process related innovation output indicators. The empirical results show that low-technology manufacturing firms lag behind their medium- and high-tech counterparts regarding their product and service innovation performance, to a large degree on purely definitional grounds, but that they seem to perform equally well and in some respects even better at process innovation.