Publications - University of Stuttgart, TIK (formerly RUS)


2016

Sebastian Kiesel and Reinaldo Penno. Port Control Protocol (PCP) Anycast Addresses. RFC 7723, RFC Editor, January 2016. [ bib | DOI | .txt | www ]

The Port Control Protocol (PCP) anycast addresses enable PCP clients to transmit signaling messages to their closest PCP-aware on-path NAT, firewall, or other middlebox without having to learn the IP address of that middlebox via some external channel. This document establishes one well-known IPv4 address and one well-known IPv6 address to be used as PCP anycast addresses.
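As a rough sketch of what a client using these anycast addresses might do, the following builds a minimal PCP request and shows where it would be sent. The header layout follows the RFC 6887 base request format, the addresses are the well-known anycast addresses from RFC 7723, and all constants should be checked against the RFCs before use:

```python
import socket
import struct

# Well-known PCP anycast addresses established by RFC 7723
# (192.0.0.9 for IPv4, 2001:1::1 for IPv6) and the PCP server
# port from RFC 6887.
PCP_ANYCAST_V4 = "192.0.0.9"
PCP_PORT = 5351

def build_announce(client_ip="192.0.2.1"):
    """Build a minimal PCP ANNOUNCE request (RFC 6887 header
    layout): version (2), R=0 with opcode ANNOUNCE (0), 16
    reserved bits, requested lifetime, and the client's IP as
    an IPv4-mapped IPv6 address."""
    version, opcode, lifetime = 2, 0, 0
    client = b"\x00" * 10 + b"\xff\xff" + socket.inet_aton(client_ip)
    return struct.pack("!BBHI", version, opcode, 0, lifetime) + client

packet = build_announce()
# A real client would now send the packet to its closest on-path
# PCP server without knowing its unicast address, e.g.:
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   sock.sendto(packet, (PCP_ANYCAST_V4, PCP_PORT))
```

Because the address is anycast, routing delivers the packet to the nearest PCP-aware middlebox on the path, which is exactly the discovery problem the RFC solves.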

2015

Thomas Richter and Jan Vanvinkenroye. Automatic Correction of Programming Exercises in ViPLab. In Proc. of IEEE Symposium on Multimedia (ISM), Miami, December 2015. IEEE. [ bib ]

Thomas Richter. Automatische Aufgabenkorrektur mit ViPLab. In Proceedings of the Second Workshop "Automatische Bewertung von Programmieraufgaben", Wolfenbüttel, Germany, November 2015. Lecture Notes in Informatics (LNI), Gesellschaft für Informatik, Bonn 2015. [ bib | pdf | www ]

ViPLab is an ILIAS plugin for running programming exercises in C, C++, Java, Matlab, and DuMux in the browser. This article describes a new component of the ViPLab 2.0 architecture that allows automatic assessment of student solutions and thus considerably reduces the workload of teaching staff, particularly in introductory courses.

Thomas Richter, Tim Bruylants, Peter Schelkens, and Touradj Ebrahimi. The JPEG XT suite of standards: status and future plans. In Proc. SPIE 9599, Applications of Digital Image Processing, San Diego, September 2015. SPIE. [ bib | DOI | pdf | www ]

The JPEG standard has seen enormous market adoption. Daily, billions of pictures are created, stored, and exchanged in this format. The JPEG committee acknowledges this success and continues its efforts to maintain and expand the standard's specifications. JPEG XT is a standardization effort targeting the extension of the JPEG features by enabling support for high dynamic range imaging, lossless and near-lossless coding, and alpha channel coding, while also guaranteeing backward and forward compatibility with the legacy JPEG format. This paper gives an overview of the current status of the JPEG XT standards suite. It discusses the JPEG legacy specification and details how higher dynamic range support is facilitated both for integer and floating-point color representations. The paper shows how JPEG XT's support for lossless and near-lossless coding of low and high dynamic range images is achieved in combination with backward compatibility to legacy JPEG. In addition, the extensible box-based JPEG XT file format, on which all following and future extensions of JPEG will be based, is introduced. This paper also details how lossy and lossless representations of alpha channels are supported to allow coding of transparency information and arbitrarily shaped images. Finally, we conclude by giving prospects on the upcoming JPEG standardization initiative JPEG Privacy & Security, and on a number of other possible extensions of JPEG XT.

Thomas Richter. Lossless Coding Extensions for JPEG. In Data Compression Conference (DCC), 2015, Snowbird, UT, April 2015. IEEE, IEEE. [ bib | DOI | www ]

The issue of backwards-compatible image and video coding has gained some attention in both MPEG and JPEG, be it as an extension for HEVC or as the JPEG XT standardization initiative of the SC29WG1 committee. The coding systems all work on the principle of a base layer operating in the low dynamic range regime, using a tone-mapped version of the HDR material as input, and an extension layer invisible to legacy applications. The extension layer allows implementations conforming to the full standard to reconstruct the original image in the high dynamic range regime. What is also common to all approaches is the rate-allocation problem: how can one split the rate between base and extension layer to ensure optimal coding? In this work, an explicit answer is derived for a simplified model of a two-layer compression system in the high bit-rate approximation. For an HDR-to-LDR tone mapping that approximates the well-known sRGB non-linearity of γ = 2.4 and a Laplacian probability density function, explicit results in the form of the Lambert W function are derived. The theoretical results are then verified in experiments using a JPEG XT demo implementation.

David Boehringer. eLearning infrastructures for co-operative degree programmes in Europe. In Global Engineering Education Conference (EDUCON), 2015 IEEE, pages 73-76, Tallinn, March 2015. [ bib | DOI | www ]

This article discusses the challenges that co-operative degree programmes pose for the eLearning infrastructures of universities. Existing interfaces of Learning Management Systems are insufficient to support the required workflows, and the handling of the students' virtual identities is all but trivial. Existing solutions are compared, and the chosen solution, the CampusConnect infrastructure, is explained both in general and for the special case of the cross-European degree programmes of KIC InnoEnergy.

2014

Sebastian Kiesel, Martin Stiemerling, Nico Schwan, Michael Scharf, and Haibin Song. Application-Layer Traffic Optimization (ALTO) Server Discovery. RFC 7286, RFC Editor, November 2014. [ bib | DOI | .txt | www ]

The goal of Application-Layer Traffic Optimization (ALTO) is to provide guidance to applications that have to select one or several hosts from a set of candidates capable of providing a desired resource. ALTO is realized by a client-server protocol. Before an ALTO client can ask for guidance, it needs to discover one or more ALTO servers.

This document specifies a procedure for resource-consumer-initiated ALTO server discovery, which can be used if the ALTO client is embedded in the resource consumer.

Richard Alimi (Editor), Reinaldo Penno (Editor), Y. Richard Yang (Editor), Sebastian Kiesel, Stefano Previdi, Wendy Roome, Stanislav Shalunov, and Richard Woundy. Application-Layer Traffic Optimization (ALTO) Protocol. RFC 7285, RFC Editor, September 2014. [ bib | DOI | .txt | www ]

Applications using the Internet already have access to some topology information of Internet Service Provider (ISP) networks. For example, views to Internet routing tables at Looking Glass servers are available and can be practically downloaded to many network application clients. What is missing is knowledge of the underlying network topologies from the point of view of ISPs. In other words, what an ISP prefers in terms of traffic optimization - and a way to distribute it.

The Application-Layer Traffic Optimization (ALTO) services defined in this document provide network information (e.g., basic network location structure and preferences of network paths) with the goal of modifying network resource consumption patterns while maintaining or improving application performance. The basic information of ALTO is based on abstract maps of a network. These maps provide a simplified view, yet enough information about a network for applications to effectively utilize them. Additional services are built on top of the maps.

This document describes a protocol implementing the ALTO services. Although the ALTO services would primarily be provided by ISPs, other entities, such as content service providers, could also provide ALTO services. Applications that could use the ALTO services are those that have a choice to which end points to connect. Examples of such applications are peer-to-peer (P2P) and content delivery networks.
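As an illustration of how an application might consume such guidance, the following sketch ranks candidate endpoints using toy network and cost maps. The PID names, prefixes, and cost values are invented for this example; the real protocol exchanges these maps as JSON over HTTP as specified in RFC 7285:

```python
import ipaddress

# Toy maps in the spirit of the ALTO network map and cost map
# (all values here are invented for illustration).
network_map = {
    "PID1": ["192.0.2.0/24"],
    "PID2": ["198.51.100.0/24"],
}
cost_map = {  # routingcost from PID1 towards each PID
    "PID1": {"PID1": 1, "PID2": 10},
}

def pid_of(ip):
    """Map an IP address to its PID via longest-prefix-style lookup."""
    for pid, prefixes in network_map.items():
        if any(ipaddress.ip_address(ip) in ipaddress.ip_network(p)
               for p in prefixes):
            return pid
    return None

def rank_candidates(client_ip, candidate_ips):
    """Order candidate endpoints by the ALTO routing cost seen
    from the client's PID (lower cost first)."""
    my_pid = pid_of(client_ip)
    return sorted(candidate_ips,
                  key=lambda ip: cost_map[my_pid][pid_of(ip)])

ranked = rank_candidates("192.0.2.7",
                         ["198.51.100.5", "192.0.2.99"])
```

A peer-to-peer client would apply such a ranking to its tracker-supplied candidate list, preferring topologically close peers without learning the ISP's internal topology.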

D. Boehringer and H. Bernlohr. CampusConnect: An open-source initiative to connect Learning Management Systems. In Global Engineering Education Conference (EDUCON), 2014 IEEE, pages 134-141, April 2014. [ bib | DOI | www ]

CampusConnect is an initiative of twelve universities in the southwest of Germany to connect their Learning Management Systems (LMSs) in support of their cooperative degree programmes and courses. The technical infrastructure of CampusConnect makes it possible to publish course information across the LMSs of different universities (thus showing students which courses are supported by electronic learning resources at other universities). It leads students to these courses (simply by links) and gives them access to these courses by offering a token-based authentication. These tasks, which without CampusConnect are time-consuming for students and IT staff alike, are handled by an architecture based on Message Oriented Middleware (MOM), with the “E-Learning-Community-Server” (ECS) acting as middleware between the LMSs of the involved universities. The ECS is responsible for the information and message routing between the systems. As a technological requirement, the LMSs (and other systems as well) get their own connectors that support the web service interface of the ECS. In use since 2008 with the LMS ILIAS, the architecture has recently been extended to support the LMSs Moodle and Stud.IP and to connect Campus Management Systems. The article describes the supported scenarios, the design, and the design principles of the architecture of CampusConnect.

T. Richter and D. Boehringer. Towards electronic exams in undergraduate engineering. In Global Engineering Education Conference (EDUCON), 2014 IEEE, pages 196-201, Istanbul, April 2014. IEEE, IEEE. [ bib | DOI | www ]

In this work, we describe the first step towards electronic exams at the University of Stuttgart. The motivation, driven by both students and lecturers, is to offer assessments that are closer to real-life programming by allowing interactive programming and debugging, which is not possible with pen-and-paper exams. This is important insofar as skills in computer algebra systems such as MATLAB, or in programming languages like Java, are an elementary ingredient of any engineering study nowadays. In this paper, we describe the technological basis for computer-based exams, built on a system already successfully deployed for homework assignments, and describe its transition to an electronic assessment system. We discuss not only the technological requirements on software, its architecture, and our choices for hardware, but also provide a brief introduction to the organizational and legal challenges that we will have to overcome.

Thomas Richter. Rate Allocation in a Two Quantizer Coding System. In Data Compression Conference (DCC), 2014, pages 83-92, Snowbird, UT, March 2014. IEEE, IEEE. [ bib | DOI | www ]

The issue of backwards-compatible image and video coding has gained some attention in both MPEG and JPEG, be it as an extension for HEVC or as the JPEG XT standardization initiative of the SC29WG1 committee. The coding systems all work on the principle of a base layer operating in the low dynamic range regime, using a tone-mapped version of the HDR material as input, and an extension layer invisible to legacy applications. The extension layer allows implementations conforming to the full standard to reconstruct the original image in the high dynamic range regime. What is also common to all approaches is the rate-allocation problem: how can one split the rate between base and extension layer to ensure optimal coding? In this work, an explicit answer is derived for a simplified model of a two-layer compression system in the high bit-rate approximation. For an HDR-to-LDR tone mapping that approximates the well-known sRGB non-linearity of γ = 2.4 and a Laplacian probability density function, explicit results in the form of the Lambert W function are derived. The theoretical results are then verified in experiments using a JPEG XT demo implementation.
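The closed-form rate split is expressed via the Lambert W function, defined by W(x)·e^{W(x)} = x. As a sketch of how such results can be evaluated numerically, here is a minimal Newton iteration for the principal branch; this is not the paper's actual rate-allocation formula, which is not reproduced here:

```python
import math

def lambert_w(x, tol=1e-12):
    """Principal branch W0 of the Lambert W function for x >= 0,
    solving w * exp(w) = x by Newton's method."""
    w = math.log1p(x)  # reasonable starting point for x >= 0
    for _ in range(100):
        ew = math.exp(w)
        step = (w * ew - x) / (ew * (w + 1))
        w -= step
        if abs(step) < tol:
            break
    return w

omega = lambert_w(1.0)  # the omega constant, approx. 0.5671
```

In the paper's setting, plugging the Laplacian source parameters and the γ = 2.4 tone-mapping slope into such a solver yields the optimal base/extension rate split directly, instead of searching over rate allocations.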

Thomas Richter, David Boehringer, and Pascal Grube. Integrating an Online Programming Lab into ILIAS. In 11th International Conference on Remote Engineering and Virtual Instrumentation (REV), pages 31-34, Porto, Portugal, February 2014. IEEE, IEEE. [ bib | DOI | www ]

Numerical mathematics and knowledge of elementary computer algorithms are essential parts of any engineering education. While our university provided access to programming languages and computer algebra systems, their installation on students' home computers often turned out to be a challenge, and solving installation problems consumed a substantial part of any freshman course. To overcome these problems, the University of Stuttgart developed an online, web-based programming tool, ViPLab, which can host many programming systems from C to MATLAB while providing a simple online development environment to students. In the past, this programming lab was based on SCORM modules, following our own conventions defined in the LiLa project. However, SCORM has a couple of limitations, too, and the overall weak integration requires a needlessly complicated workflow for both lecturers creating exercises and students handing in homework solutions. In this work, we describe these limitations and how to overcome them with a solution specific to the Learning Management System (LMS) used at the University of Stuttgart.

D. Boehringer, B. May, and A. Jokiaho. CampusConnect - ein baden-württembergisches Projekt, volume 9 of TRANSFER Ludwigsburger Hochschulschriften, pages 247-268. RabenStück-Verlag, Berlin, 2014. [ bib ]

2013

T. Richter. On the Standardization of the JPEG XT Image Compression. In Picture Coding Symposium (PCS), 2013, pages 37-40, San Jose, CA, December 2013. IEEE, IEEE. [ bib | DOI | www ]

ISO recently started a standardization initiative on a forwards-compatible extension of its popular JPEG (ISO/IEC 10918-1) standard. This new standard aims at carefully extending the feature set of the existing technology while preserving the established tool chain built around its 20-year-old predecessor. In this work, an overview of JPEG XT, its goals, and the underlying technology is given, and two of the proposed coding technologies for JPEG XT are compared side by side.

T. Richter and S. Simon. Coding Strategies and Performance Analysis of GPU Accelerated Image Compression. In Picture Coding Symposium (PCS), 2013, pages 125-128, San Jose, CA, December 2013. IEEE, IEEE. [ bib | DOI | www ]

Graphics Processing Units (GPUs) are freely programmable, massively parallel general-purpose processing units and thus offer the opportunity to off-load heavy computations from the CPU to the GPU. One application for GPU programming is image compression, where the massively parallel nature of GPUs promises high speed benefits. However, measurements against competitive, highly optimized CPU implementations show that GPU-based codecs are usually not considerably faster, or perform only with less-than-ideal rate-distortion performance. This article presents the predicaments of data-parallel image coding by first presenting a series of theoretical arguments that limit the performance of such implementations, before advancing to existing GPU implementations demonstrating the challenges of parallel image coding. It is argued, and seen in experiments, that either parts of the entropy coding and bitstream build-up must remain serial, or rate-distortion penalties must be paid when offloading all computations onto the GPU.

J. Vanvinkenroye, C. Gruninger, C.-J. Heine, and T. Richter. A Quantitative Analysis of a Virtual Programming Lab. In IEEE International Symposium on Multimedia (ISM 2013), pages 457-461, Anaheim, California, USA, December 2013. IEEE, IEEE. [ bib | DOI | www ]

In a recent publication, we described the design and implementation of a web-based programming lab (ViPLab) targeted at undergraduate engineering and mathematics courses. This work provides a quantitative analysis of the user feedback, experience, and learning success. We implemented a survey with one learning group using the web-based tools and a control group working with a traditional setup based on editor and compiler. The survey shows that web-based installations are as efficient as classical tools, while Windows users prefer the web-based tool chain over the editor/compiler installation on Linux. This justifies the use of web-based installations in programming beginner courses if the learning target focuses on programming and not on a particular tool chain.

David J. Lutz and Burkhard Stiller. A survey of payment approaches for identity federations in focus of the SAML technology. IEEE Communications Surveys and Tutorials, 15(4):1979-1999, October 2013. [ bib | DOI | www ]

Identity Federations are increasingly being used to establish convenient and secure attribute-based authentication and authorization systems. Whilst this process began mainly in the academic sector, it is assumed that over the next few years more and more commercial Service Providers will join Identity Federations in order to offer their services and products to federated customers. However, the introduction of commercial Service Providers demands a solution for payment, which has not been deployed during the early years of Identity Federations. Thus, Service Providers have to implement not only the federation application, but also additional payment solutions; a problem that may make the federation appear unattractive to Service Providers, especially semi-commercial ones or those requiring micropayments. Even for large commercial providers entering a federation, the lack of payment support is a major disadvantage that may lead to either customer or profit loss. Thus, although a combination of electronic payment solutions and Identity Federation approaches would provide several benefits to its participants, there has not been much investigation of such combinations. Therefore, this paper analyses electronic payment approaches as well as Identity Federation mechanisms and focuses on a solution to bridge these two aspects. Besides early stages of identity-based payments, fully integrated SAML-based payment approaches, which merge payments and Identity Federation into a powerful business solution, are also highlighted. However, since security is a major concern when focusing on payment solutions, several approaches have been investigated, including security and privacy evaluations, and, within this survey, only those solutions providing a sufficient level of security and privacy have been taken into consideration.

T. Richter and S. Simon. Comparison of CPU and GPU Based Coding on Low-Complexity Algorithms for Display Signals. In Andrew G. Tescher, editor, Applications of Digital Image Processing XXXVI, volume 8856, 14 pages. SPIE, SPIE, October 2013. [ bib | DOI | www ]

Graphics Processing Units (GPUs) are freely programmable, massively parallel general-purpose processing units and thus offer the opportunity to off-load heavy computations from the CPU to the GPU. One application for GPU programming is image compression, where the massively parallel nature of GPUs promises high speed benefits. This article analyzes the predicaments of data-parallel image coding using the example of two high-throughput coding algorithms. The codecs discussed here were designed to answer a call from the Video Electronics Standards Association (VESA) and require only minimal buffering at encoder and decoder side while avoiding any pixel-based feedback loops limiting the operating frequency of hardware implementations. Comparing CPU and GPU implementations of the codecs shows that GPU-based codecs are usually not considerably faster, or perform only with less-than-ideal rate-distortion performance. Analyzing the details of this result provides theoretical evidence that for any coding engine either parts of the entropy coding and bit-stream build-up must remain serial, or rate-distortion penalties must be paid when offloading all computations onto the GPU.

Christoph Demont, Uwe Breitenbücher, Oliver Kopp, Frank Leymann, and Johannes Wettinger. Towards Integrating TOSCA and ITIL. In Oliver Kopp and Niels Lohmann, editors, Proceedings of the 5th Central-European Workshop on Services and their Composition (ZEUS 2013), volume 1029 of CEUR Workshop Proceedings, pages 28-31. CEUR-WS.org, September 2013. [ bib | www ]

The integration of the low-level management functionality provided by TOSCA with high-level processes as defined by ITIL may offer significant improvement opportunities to the application provider, as workflow technology can be employed on both levels. In this paper, we present Key Performance Indicator Analysis Plans as a first idea of how both approaches can be integrated.

Thomas Richter. A Global Image Fidelity Metric: Visual Distance and its Properties. In IEEE International Conference on Image Processing ICIP 2013, pages 369-373, Melbourne, Victoria, Australia, September 2013. IEEE, IEEE. [ bib | DOI | www ]

The purpose of full-reference image quality indices like Mean Square Error (MSE) or SSIM is to predict the judgement of human observers in subjective quality assessment tasks. More advanced indices like SSIM or VIF are, however, rarely metrics in the strict sense, i.e., they do not define a distance between pairs of images that would describe how far apart these images are. It was shown in a recent work, however, that SSIM can be "integrated" to such a global metric, in the following called Visual Distance, which behaves locally like SSIM, but globally like a distance in a curved space. It was also seen that the Visual Distance between two images can be interpreted as the number of almost invisible image deformations required to transform one image into another. In this work, properties of Visual Distances are discussed; these results allow the extension of the result from SSIM to its multi-scale variant MS-SSIM. It will also be seen that human judgement will typically not define a metric, but it is conjectured that scores and Visual Distances are related by a monotone judgement function. If so, the underlying Visual Distance can always be reconstructed from the scores up to a proportionality factor defined by the scale of the score.

Th. Richter, Z. Wang, S. Simon, M. Klaiber, and S. Ahmed. SSPQ - spatial domain perceptual image codec based on subsampling and perceptual quantization. In 2012 IEEE International Conference on Image Processing ICIP 2012, pages 1061-1064, Lake Buena Vista, Florida, USA, September 2012. IEEE, IEEE. [ bib | DOI | www ]

A spatial domain perceptual image codec based on subsampling and perceptual quantization (SSPQ) guided by the just-noticeable distortion (JND) profile is proposed. SSPQ integrates perceptual image coding and progressive transmission in one framework. The input image is first subsampled by a factor of two in both dimensions and the subsampled image is compressed without loss. The subsampled image provides a basis for both predicting the input pixels by interpolation and estimating the JND values for each pixel. Residual quantization thresholds are set to the estimated JND values for a perceptually tuned compression. Quantized residuals are progressively encoded by a context-based Golomb coder with run-length coding capacity. Experimental results show over 50% improvement in compression performance on average for the proposed SSPQ codec compared to the lossless JPEG-LS.
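The prediction-and-quantization step described above can be illustrated with a toy one-dimensional sketch. The real SSPQ codec works on 2-D images with per-pixel JND estimates and a context-based Golomb coder, none of which is reproduced here; the constant JND value below is an assumption for illustration:

```python
# Toy 1-D version of the SSPQ idea: keep every second sample
# losslessly, predict the missing samples by linear interpolation
# from the kept neighbours, and quantize the residual with a
# (here constant) JND threshold, so each error stays within JND.

def sspq_encode(samples, jnd=4):
    base = samples[::2]                 # subsampled part, kept lossless
    residuals = []
    step = 2 * jnd + 1                  # quantizer step tuned to the JND
    for i in range(1, len(samples), 2):
        left = samples[i - 1]
        right = samples[i + 1] if i + 1 < len(samples) else left
        pred = (left + right) // 2      # interpolation predictor
        residuals.append(round((samples[i] - pred) / step))
    return base, residuals

def sspq_decode(base, residuals, jnd=4):
    out, step = [], 2 * jnd + 1
    for i, b in enumerate(base):
        out.append(b)
        if i < len(residuals):
            right = base[i + 1] if i + 1 < len(base) else b
            pred = (b + right) // 2
            out.append(pred + residuals[i] * step)
    return out

original = [10, 60, 20, 25, 30, 90, 40]
base, res = sspq_encode(original)
rec = sspq_decode(base, res)
```

With integer samples and step 2·JND+1, every reconstructed sample differs from the original by at most the JND, while the subsampled half is reproduced exactly.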

M. Scharf and S. Kiesel. Quantifying Head-of-Line Blocking in TCP and SCTP. Internet-Draft draft-scharf-tcpm-reordering-00.txt, IETF Secretariat, July 2013. [ bib | www ]

In order to quantify the impact of head-of-line blocking on application latencies, this memo provides simple analytical models for a "back of the envelope" estimation of the delay impact for reliable transport over a single TCP connection, multiple TCP connections, multiple SCTP streams, and unordered transport.
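A toy version of such a back-of-the-envelope model might look as follows. The uniform-loss assumption, the half-window heuristic, and the recovery-time parameter are illustrative choices for this sketch and are not the memo's actual formulas:

```python
# Back-of-the-envelope sketch: a lost segment stalls delivery of
# everything behind it on the same ordered stream for roughly one
# loss-recovery time (about one RTT with fast retransmit).
# Spreading messages over more independent ordered streams means
# fewer messages share a stream, so less collateral blocking.

def mean_hol_delay(p, recovery_time, inflight, streams=1):
    """Rough mean extra delivery delay per message: a message is
    stalled if it, or any message ahead of it on its own stream,
    hits a loss; on average half of the per-stream window is
    ahead of a given message."""
    ahead = inflight / streams / 2.0
    p_blocked = 1.0 - (1.0 - p) ** (ahead + 1)
    return p_blocked * recovery_time

# 1% loss, 100 ms recovery, 10 messages in flight:
single = mean_hol_delay(0.01, 0.1, inflight=10, streams=1)
multi = mean_hol_delay(0.01, 0.1, inflight=10, streams=10)
```

Even this crude model reproduces the qualitative point of the memo: multiple streams (or unordered delivery) reduce the head-of-line blocking penalty for a given loss rate.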

Thomas Richter. Backwards Compatible Coding of High Dynamic Range Images with JPEG. In Data Compression Conference (DCC), 2013, pages 153-160, Snowbird, UT, March 2013. IEEE, IEEE. [ bib | DOI | www ]

In its Paris meeting, the JPEG committee decided to work on a backwards-compatible extension of the popular JPEG (10918-1) standard enabling lossy and lossless coding of high dynamic range (HDR) images. The new standard shall allow legacy applications to decompress new code streams into a tone-mapped version of the HDR image, while codecs aware of the extensions will decompress the stream with full dynamic range. This paper proposes a set of extensions that have rather low implementation complexity and use, whenever possible, functional design blocks already present in 10918-1. It is seen that, despite its simplicity, the proposed extension performs close to JPEG 2000 (15444-2) and JPEG XR (29199-2) on the HDR test image set of the JPEG committee at high bit-rates.

Thomas Richter. High Throughput Coding of Video Signals. In Data Compression Conference (DCC), page 517, Snowbird, UT, March 2013. IEEE, IEEE. [ bib | DOI | www ]

As the resolution of monitors and TVs continues to increase, the available bandwidth between host system and monitor becomes more and more of a bottleneck. The Video Electronics Standards Association (VESA) is currently developing standards for screen resolutions beyond 4K and, facing the problem of not having enough bandwidth available on traditional copper wires, contacted the JPEG committee to develop a low-complexity, high-throughput still image coder for lossy transmission of video signals. This article describes two approaches to address this predicament: a simple SPIHT-based codec and a Hadamard-based embedded codec, each requiring only minimal buffering at encoder and decoder side and avoiding any pixel-based feedback loops limiting the operating frequency of hardware implementations. Analyzing the details of both implementations reveals an interesting connection between run-length coding, as found in the progressive mode of traditional JPEG coding, and SPIHT/EZW coding, a technique popular in wavelet-based compression.
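The Hadamard transform used as the building block of the second codec needs only additions and subtractions, which is what makes it attractive for high-throughput hardware. A minimal 4-point sketch follows; the transform size and normalization of the actual proposal may differ:

```python
# 4-point Hadamard transform: H is symmetric with orthogonal
# rows and H*H = 4*I, so the inverse is just H applied again,
# divided by 4. Only additions/subtractions are needed.

H = [
    [1,  1,  1,  1],
    [1, -1,  1, -1],
    [1,  1, -1, -1],
    [1, -1, -1,  1],
]

def hadamard4(x):
    return [sum(H[i][j] * x[j] for j in range(4)) for i in range(4)]

def inverse_hadamard4(y):
    # Every component of H*(H*x) is 4*x_i, hence exactly divisible.
    return [v // 4 for v in hadamard4(y)]

coeffs = hadamard4([10, 12, 11, 13])
restored = inverse_hadamard4(coeffs)
```

The first coefficient carries the block mean (times 4) and the remaining ones the detail, so an embedded coder can transmit them bit-plane by bit-plane without any multiplier in the data path.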

Alexander Vensmer and Sören Berger. DynFire - Verteilte Firewalls in heterogenen Netzwerken. 20. DFN Workshop "Sicherheit in vernetzten Systemen", February 2013. [ bib ]

This contribution presents "DynFire", a new approach to configuring firewalls dynamically and on a per-user basis. With DynFire, a network can control which users are allowed to establish connections to which services. This is realized by reconfiguring the firewalls deployed in the network. The basic assumption of DynFire is that, within a strictly controlled network separated from the Internet, there is always a mapping between IP address and user identity. Whenever a user authenticates, the corresponding firewalls are opened or closed accordingly. This is realized with a central firewall manager instance and standardized signaling protocols.

2012

Th. Richter, P. Grube, and D. Zutin. A Standardized Metadata Set for Annotation of Virtual and Remote Laboratories. In 2012 IEEE International Symposium on Multimedia ISM 2012, pages 451-456, Irvine, California, USA, December 2012. IEEE, IEEE. [ bib | DOI | www ]

Online laboratories and virtual experiments are starting to play an increasingly important role in engineering and science education. While several repositories for online and virtual experiments are available, a common method for annotating experiments to simplify their discovery is not yet available and accepted. In 2010, an international group of online lab providers formed the Global Online Lab Consortium (GOLC) to address the issues of interoperability between online laboratories and laboratory compilations. One of its activities is the establishment of an ontology and a common metadata set that addresses not only the needs of typical lab providers and lab users, but also of storage and archival institutions such as libraries. This article describes the current status of the GOLC activities in the metadata subcommittee, lists the requirements of the various user groups of the metadata set, and provides insight into both the underlying ontology and the metadata specifications themselves.

Sören Berger, Alexander Vensmer, and Sebastian Kiesel. An ABAC-based Policy Framework for Dynamic Firewalling. In Proceedings of the Seventh International Conference on Systems and Networks Communications (ICSNC 2012), pages 118-123, Lisbon, Portugal, November 2012. [ bib | pdf | www ]

This paper presents the policy framework of DynFire, a novel approach for attribute-based, dynamic control of network firewalls. DynFire allows individually controlled, secure access to the IT resources of a large organization, with particular focus on mobile users and users with restricted rights, such as subcontractors. The basic assumption behind DynFire is that, within a secured network domain separated from the Internet, a temporary binding between an IP address and a single user ID can be established. Users with different attributes can authenticate to the network and get individual access to network resources. To administrate such a large number of users and different access rights within a secured network domain of an organization, which includes distributed organizational zones, a policy framework is needed. This paper presents a policy framework for dynamic and distributed firewalls that is able to grant access on a per-user basis, with multi-tenancy capabilities and administrative delegation.
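The per-user, attribute-based decisions described above can be sketched as a tiny rule-matching loop. The attribute names, services, and rule format below are invented for illustration and are not DynFire's actual policy language:

```python
# Minimal attribute-based access control (ABAC) sketch: a policy
# is an ordered list of (required attributes, service, action)
# tuples, evaluated first-match with a default deny.

POLICY = [
    ({"role": "staff"},                             "intranet", "allow"),
    ({"role": "subcontractor", "project": "p-42"},  "scada-gw", "allow"),
]

def decide(user_attrs, service):
    """Return the first matching action for this user/service
    pair; deny if no rule matches."""
    for required, svc, action in POLICY:
        if svc == service and all(user_attrs.get(k) == v
                                  for k, v in required.items()):
            return action
    return "deny"

# On authentication, a firewall manager would translate such a
# decision, together with the user's current IP binding, into
# concrete firewall rules.
d1 = decide({"role": "subcontractor", "project": "p-42"}, "scada-gw")
d2 = decide({"role": "subcontractor"}, "intranet")
```

The first-match-with-default-deny structure is what makes administrative delegation tractable: each organizational zone can contribute its own rule segment without being able to widen access granted elsewhere.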

Thomas Richter and Sven Simon. Towards high-speed, low-complexity image coding: Variants and modification of JPEG 2000. In Andrew G. Tescher, editor, Applications of Digital Image Processing XXXV, volume 8499, page 10. SPIE, SPIE, October 2012. [ bib | DOI | www ]

Recently, the JPEG committee discussed the introduction of an "ultrafast" mode for JPEG 2000 encoding. This proposed extension of the JPEG 2000 framework replaces the EBCOT coding by a combined Huffman/run-length code and adds an optional additional prediction step after quantization. While the resulting codec is not compatible with existing JPEG 2000, it still allows lossless transcoding from JPEG 2000 and back, and performance measurements show that it offers nearly the quality of JPEG 2000 and similar quality to JPEG XR at a much lower complexity, comparable to that of the IJG JPEG software. This work introduces the extension and compares its performance with other JPEG standards and other extensions of JPEG 2000 currently under standardization.

Ronald Marx and Sebastian Kiesel. Dynamic firewalling for femto-cell communication. IIT Real-Time Communications Conference & Expo, September 2012. [ bib | www ]

Application scenarios for femto cells (aka Home (e)Node Bs) have grown constantly since their first specification in 3GPP Release 8. However, the deployment of femto cells is challenging for network operators, as femto cells get direct access to the core network via a VPN tunnel. Moreover, the femto cells' integrity cannot be guaranteed because they are operated outside of the operator's domain. Thus, security is increased enormously if access within the core network is limited to allowed services by setting user-dependent firewalling rules. The configuration effort can be eased by employing a dynamic firewalling approach, which supports setting firewall rules on demand on a per-user basis.

This talk will present the dynamic firewalling approach developed in the DynFire project. DynFire allows individually controlled, secure access to the IT resources of a large organization, with particular focus on mobile users and users with restricted rights.

Sebastian Kiesel (Editor), Stefano Previdi, Martin Stiemerling, Richard Woundy, and Y. Richard Yang. Application-Layer Traffic Optimization (ALTO) Requirements. RFC 6708, RFC Editor, September 2012. [ bib | external link DOI | external link .txt | external link www ]

Many Internet applications are used to access resources, such as pieces of information or server processes that are available in several equivalent replicas on different hosts. This includes, but is not limited to, peer-to-peer file sharing applications. The goal of Application-Layer Traffic Optimization (ALTO) is to provide guidance to applications that have to select one or several hosts from a set of candidates capable of providing a desired resource. This guidance shall be based on parameters that affect performance and efficiency of the data transmission between the hosts, e.g., the topological distance. The ultimate goal is to improve performance or Quality of Experience in the application while reducing the utilization of the underlying network infrastructure.

This document enumerates requirements for specifying, assessing, or comparing protocols and implementations.

Thomas Richter and Sven Simon. On the JPEG 2000 Ultrafast Mode. In 2012 IEEE International Conference on Image Processing ICIP 2012, pages 2501 - 2504, Lake Buena Vista, Florida, USA, September 2012. IEEE, IEEE. [ bib | external link DOI | external link www ]

Recently, the JPEG committee discussed the introduction of an “ultrafast” mode for JPEG 2000 encoding. The proposed extension of the JPEG 2000 framework replaces the EBCOT coding by a combined Huffman-Runlength code, and adds an optional additional prediction step after quantization. While the resulting codec is not compatible with existing JPEG 2000, it still allows lossless transcoding from JPEG 2000 and back, and performance measurements show that it offers nearly the quality of JPEG 2000 and similar quality to JPEG XR at a much lower complexity, comparable to that of the IJG JPEG software. This work introduces the extension, and compares its performance with other JPEG standards and other extensions of JPEG 2000 currently under standardization.

Z. Wang, M. Klaiber, Y. Gera, S. Simon, and T. Richter. Fast lossless image compression with 2D Golomb parameter adaptation based on JPEG-LS. In Proceedings of the 20th European Signal Processing Conference (EUSIPCO), pages 1920-1924, August 2012. [ bib | external link www ]

A Fast and Lossless Image Compression (FLIC) algorithm based on the median edge predictor and Golomb coder of JPEG-LS is presented. FLIC eliminates the gradient-based context model of the JPEG-LS standard, the most expensive part with respect to computational complexity and memory space requirements. To avoid a large context memory, the Golomb parameter is selected based on the coding states and the prediction residuals of up to two immediate neighbors, one in each dimension. The FLIC algorithm has a low memory footprint and dissolves the data dependencies in JPEG-LS to facilitate parallelization. Experimental results show that the FLIC algorithm achieves a throughput speedup factor of 3.7 over JPEG-LS with less than 4% compression performance penalty. Lossless compression performance results further show that FLIC outperforms other state-of-the-art standards including JPEG 2000 and JPEG XR.
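
The two building blocks named in the abstract, the JPEG-LS median edge predictor and a Golomb-Rice code, can be sketched as follows. This is a generic textbook illustration, not FLIC's 2D parameter adaptation itself, and all function names are ours:

```python
def med_predict(a: int, b: int, c: int) -> int:
    """JPEG-LS median edge detection (MED) predictor.
    a = left, b = above, c = above-left neighbor of the current pixel."""
    if c >= max(a, b):
        return min(a, b)      # edge suspected: pick the smaller neighbor
    if c <= min(a, b):
        return max(a, b)      # edge suspected: pick the larger neighbor
    return a + b - c          # smooth region: planar prediction

def zigzag(residual: int) -> int:
    """Fold a signed prediction residual into a non-negative integer
    (0, -1, 1, -2, 2, ... -> 0, 1, 2, 3, 4, ...) before Golomb coding."""
    return 2 * residual if residual >= 0 else -2 * residual - 1

def rice_encode(value: int, k: int) -> str:
    """Golomb-Rice code with parameter k: quotient in unary,
    then the k-bit remainder in plain binary."""
    bits = "1" * (value >> k) + "0"
    if k > 0:
        bits += format(value & ((1 << k) - 1), f"0{k}b")
    return bits
```

A residual of -3, for instance, folds to 5 and, with k = 2, is coded as the four bits 1001; FLIC's contribution lies in how k is selected from the two immediate neighbors rather than from a gradient-based context.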

S. Kiesel and M. Stiemerling. 3rd Party ALTO Server Discovery (3pdisc). Internet-Draft draft-kist-alto-3pdisc-00.txt, IETF Secretariat, July 2012. [ bib | external link www ]

The goal of Application-Layer Traffic Optimization (ALTO) is to provide guidance to applications, which have to select one or several hosts from a set of candidates that are able to provide a desired resource.

Entities seeking guidance need to discover and possibly select an ALTO server to ask. This is called ALTO server discovery. This memo describes an ALTO server discovery mechanism for a 3rd party setting, i.e., where the ALTO client is not co-located with the actual resource consumer.

T Richter, R. Watson, S. Kassavetis, M Kraft, P. Grube, D. Boehringer, P. de Vries, E. Hatzikraniotis, and S. Logothetidis. The WebLabs of the University of Cambridge: A study of securing remote instrumentation. In 9th International Conference on Remote Engineering and Virtual Instrumentation (REV), pages 1-5, Bilbao, Spain, July 2012. IEEE, IEEE. [ bib | external link DOI | external link www ]

Safe deployment of web interfaces for remote instrumentation requires that the laboratory system be protected from harmful manipulation by end users or attacks from malicious software over the Internet. Industrial control systems, although highly relevant to contemporary engineering education and an essential component of many remote experiments, are typically designed to run only in a secured local area network and cannot safely be exposed to the Internet because they lack a sufficiently robust security infrastructure. They also typically require the installation of proprietary software on the end-user system, which is an obstacle for deployment in learning scenarios at universities. Facing these challenges when bringing the Chemical Engineering WebLabs at the University of Cambridge online, the Computing Center of the University of Stuttgart and the University of Cambridge developed a framework to allow the secure deployment of industrial controller software in remote learning applications; this framework is generic, has a low barrier for students as it only requires an Internet browser and a Java installation, and it satisfies the high security demands of most university infrastructure providers. Furthermore, the framework has the potential to be applied to almost any remote laboratory setup and is compatible with all commonly used operating systems at the user end.

Alexander Vensmer and Sebastian Kiesel. DynFire: Dynamic Firewalling in Heterogeneous Environments. In Proceedings of the World Congress on Internet Security (WorldCIS-2012), Guleph, Canada, June 2012. [ bib | pdf | www ]

This paper presents “DynFire,” a novel approach for the role-based, dynamic control of network firewalls. DynFire allows an individually controlled, secure access to the IT resources of a large organization, with particular focus on mobile users and users with restricted rights, such as subcontractors. The basic assumption behind DynFire is that, within a secured network domain separated from the Internet, we can establish a temporary binding between an IP address and a single user ID. Whenever a user connects to or disconnects from this secure network domain, firewalls are configured accordingly, using a centralized “Firewall Manager” and standardized signaling protocols.

Zhe Wang, D. Chanda, S. Simon, and T. Richter. Memory efficient lossless compression of image sequences with JPEG-LS and temporal prediction. In Picture Coding Symposium (PCS), 2012, pages 305-308, May 2012. [ bib | external link DOI | external link www ]

In this paper, a lossless encoder for image sequences is proposed, based on the still-image codec JPEG-LS extended with temporal prediction and context modeling. As embedded systems are one important field of application of the codec, on-line lossy reference frame compression is used to reduce the encoder's memory requirement. Variations of the pixel values in the reference frame due to lossy compression are acceptable, since the predictor provides only estimations of the pixel values being encoded in the current frame. Larger variations decrease the final lossless compression performance of the encoder, such that a trade-off between the memory requirement and the overall compression ratio is required. Different compression algorithms for the reference frame, including JPEG, JPEG 2000 and near-lossless JPEG-LS, and their impacts on the memory requirement and the overall lossless compression ratio have been studied. Experimental results show a gain of 9.6% or more in lossless compression ratio compared to applying the standard JPEG-LS frame-by-frame, and an 80% reduction in the encoder buffer size compared to storing the uncompressed reference frame.

Yurij Gera, Zhe Wang, Sven Simon, and Thomas Richter. Fast and Context-free Lossless Image Compression Algorithm based on JPEG-LS. In James A. Storer and Michael W. Marcellin, editors, Data Compression Conference (DCC), 2012, Snowbird, Utah, USA, April 2012. IEEE, IEEE. [ bib | external link DOI | external link www ]

While the context-based entropy coding and bias cancellation steps in the JPEG-LS standard are key features of its compression performance, these steps also enlarge the memory footprint, create dependencies in the data path of implementations and hence limit parallelism on modern multi-core or GPU architectures, as well as the throughput of hardware implementations. The proposed modification of JPEG-LS omits these parts, which are the most expensive with respect to memory space requirements and computational complexity.

Thomas Richter. Compressing JPEG 2000 JPIP Cache State Information. In James A. Storer and Michael W. Marcellin, editors, Data Compression Conference (DCC), 2012, pages 13-21, Snowbird, Utah, USA, April 2012. IEEE, IEEE. [ bib | external link DOI | external link www ]

JPEG 2000 Part 9, or JPIP for short, is an interactive image browsing protocol that allows the selective delivery of image regions, components, or scales from a JPEG 2000 image. Typical applications are browsing tools for medical databases, where transmitting huge images from server to client in their entirety would be uneconomical. Instead, JPIP allows extracting only the desired image parts for analysis via an HTTP-style request syntax. A JPIP connection may either operate within a session, in which the server remains aware of the image data already cached at the client and hence need not transmit it again, or it may operate in a stateless mode, in which the server has no model of the data already available at the client. In the latter case, the client may include a description of its cache model within a request to avoid retransmission of data it has already buffered. Unfortunately, the methods the standard defines for describing such cache models are very inefficient, and a single request including a cache model may grow to several kilobytes for typical images and requests, making the deployment of a JPIP server on top of existing HTTP server infrastructure rather inconvenient. In this work, a lossy and lossless embedded compression scheme for such JPIP cache model adjustment requests, based on a modified zero-tree algorithm, is proposed; the algorithm works even in constrained environments where the request size must remain limited. The proposed algorithm losslessly compresses such cache model adjustment requests by a factor of often more than 8, and may even achieve a factor of 8000 in cases where the cache model has to describe a large number of precincts.

Sebastian Kiesel. VoIPUS: IP-Telefonie für die Universität Stuttgart. 56. Betriebstagung des Vereins zur Förderung eines Deutschen Forschungsnetzes (DFN), March 2012. [ bib | external link pdf ]

M. Stein, K. Oberle, T. Voith, D. Lamp, and S. Berger. Network management in virtualized infrastructures. In Achieving Real-Time in Distributed Computing: From Grids to Clouds, pages 218-235. IGI Global, 2012. [ bib ]

D. Lamp, S. Berger, M. Stein, T. Voith, T. Cucinotta, and M. Bertogna. Execution and resource management in QoS-aware virtualized infrastructures. In Achieving Real-Time in Distributed Computing: From Grids to Clouds, pages 200-217. IGI Global, 2012. [ bib ]

Th. Richter, S. Rudlof, D. Boehringer, Ch. Grüninger, R. Helmig, Ch. Rohde, H. Bernlohr, C.-D. Munz, and A. Stock. ViPLab - A Virtual Programming Laboratory for Mathematics and Engineering. In SEFI 40th annual conference, 2012. [ bib | external link www ]

In the process of the implementation of the eBologna program of the European states and the recent change of the German university system from the Diploma to the Bachelor/Master system, studies at German universities have been redesigned; courses have been condensed and learning content has been re-structured into granular modules, each of which requires an evaluation at the end of the semester. Simultaneously, the skills required for working in research and development have changed as well; handling of computer software, knowledge of mathematical or numerical algorithms and programming skills play an increasingly important role in the daily job routine of the working engineer. To support learning by practical exercises, the engineering faculties, the faculties of mathematics and physics, and the Computing Center of the University of Stuttgart set up a project for implementing an online programming lab for teaching the required skills. The focus of this project is to provide easy access to the necessary software tools, avoid the overhead of installation and maintenance, and seamlessly integrate these tools into the eLearning infrastructure of the university. This paper describes the motivation and background, the software infrastructure and early results of this project.

Tommaso Cucinotta, Fabio Checconi, George Kousiouris, Kleopatra Konstanteli, Spyridon V. Gogouvitis, Dimosthenis Kyriazis, Theodora A. Varvarigou, Alessandro Mazzetti, Zlatko Zlatev, Juri Papay, Michael Boniface, Sören Berger, Dominik Lamp, Thomas Voith, and Manuel Stein. Virtualised e-learning on the IRMOS real-time cloud. Service Oriented Computing and Applications, 6(2):151-166, 2012. [ bib ]

2011

Th. Richter, S. Rudlof, B. Adjibadji, H. Bernlohr, C. Gruninger, C. Munz, C. Rohde, and R. Helmig. ViPLab - A Virtual Programming Laboratory for Mathematics and Engineering. In ISM 2011 IEEE International Symposium on Multimedia, pages 537-542, Dana Point, CA, December 2011. IEEE, IEEE. [ bib | external link DOI | external link www ]

In the process of the implementation of the eBologna program of the European states and the recent change of the German university system from the Diploma to the Bachelor/Master system, studies at German universities have been redesigned: courses have been condensed and learning content has been re-structured into granular "modules", each of which requires an evaluation at the end of the semester. Simultaneously, the skills required for working as an engineer have changed as well: handling of computer software, knowledge of mathematical or numerical algorithms and programming skills play an increasingly important role in the daily job routine of the working engineer. To support learning by practical exercises, the engineering faculties, the faculties of mathematics and physics, and the Computing Center of the University of Stuttgart set up a project for implementing an online programming lab for teaching the required skills. The focus of this project is to provide easy access to the necessary software tools, avoid the overhead of installation and maintenance, and seamlessly integrate these tools into the eLearning infrastructure of the university. This paper describes the motivation and background, the software infrastructure and early results of this project.

Thomas Richter, Yvonne Tetour, and David Boehringer. Library of Labs: A European Project on the Dissemination of Remote Experiments and Virtual Laboratories. In Bob Werner, editor, IEEE International Symposium on Multimedia (ISM 2011), pages 543 - 548, Dana Point, California, USA, December 2011. IEEE, IEEE. [ bib | external link DOI | external link www ]

In this paper, we provide background information on the EC funded Lila Project (“Library of Labs”), describe its goals and purposes, provide some insight into its software design, and report first experiences made at the University of Stuttgart using the eLearning content deployed by the project.

Y. Tetour, D. Boehringer, and T. Richter. Integration of virtual and remote experiments into undergraduate engineering courses. In Global Online Laboratory Consortium Remote Laboratories Workshop (GOLC), 2011 First, pages 1 - 6, Rapid City, SD, October 2011. IEEE. [ bib | external link DOI | external link www ]

Experiments play an important role in the education of undergraduate engineering students as they provide hands-on experience of the foundations of the discipline. Unfortunately, the recent change of the university program in Germany from the Diploma to the Bachelor/Master model had a direct negative impact on the curricula and the course schedules. The result is that the first-year curriculum is overloaded; exercises or practical courses have been dropped altogether from the first term, rescheduled to later terms, or reduced in length. For this reason, other forms of experimenting have to be developed and integrated into the existing courses or lectures; virtual laboratories and remote experiments offer such an option: they enable students to access equipment 24 hours a day, 7 days a week, independent of opening hours and the work schedule of the staff. Furthermore, simulations have some other advantages: they provide better control of the simulated phenomenon and allow observing effects and running experiments that are very hard to measure or perform in practical settings. Another advantage, besides their cost-efficiency, is that simulations allow observations of effects in a simplified environment without any measurement errors. Therefore, remote and virtual experiments have already been or will soon be set up by various universities across Europe. However, building a pool of experiments sufficient to cover all of undergraduate physics is an overwhelmingly complex and costly task for a single university to handle on its own. Therefore, the EU funded LiLa project, short for "Library of Labs", is building a network of virtual laboratories and remote experiments. The LiLa Portal provides access to manifold experiments, free to use in courses and lectures. Additionally, LiLa partners will profit from the experience gained from using remote experiments, and the LiLa network will provide best practices in applications of remote experiments. This article starts with an introduction of LiLa and its aims; it then presents the integration of virtual labs and remote experiments into an existing course, "Physics for Engineers", at the University of Stuttgart. We introduce our concept for exercises and present some results of the pilot phase, which took place in the winter term 2009/2010, as well as results from the winter term 2010/2011.

Thomas Richter. EEM quantization revisited: asymptotic optimality for variable rate coding. In Andrew G. Tescher, editor, Applications of Digital Image Processing XXXIV, volume 8135. SPIE, SPIE, September 2011. [ bib | external link DOI | external link www ]

Equal-Expectation Magnitude Quantization (EEM) aims at minimizing the distortion of a quantizer with defined reconstruction points by shifting the deadzone parameter such that the expectation value of the signal equals the reconstructed value. While intuitively clear, this argument is not sufficient to prove rate-distortion optimality. In this work, it is shown that the EEM quantizer is rate-distortion optimal up to third order in an expansion in powers of the quantization bucket size in the high-bitrate approximation, and the approximating series for the optimal quantizer is computed. This result is compared to an even simpler quantization strategy based on the Lloyd-Max quantizer which selectively sets coefficients to zero. It is shown that both strategies lead to the same asymptotic expansion for the threshold parameter, but zeroing coefficients provides optimality in one additional order in the quantization bucket size.
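
For readers unfamiliar with the deadzone parameter being shifted here, a minimal scalar deadzone quantizer can be sketched as follows. This is a generic illustration with mid-bucket reconstruction, not the EEM reconstruction rule derived in the paper; names and defaults are ours:

```python
import math

def quantize(x: float, delta: float, tau: float = 0.5) -> int:
    """Deadzone scalar quantizer: sign(x) * floor(|x| / delta + tau).
    tau = 0.5 gives a uniform quantizer; smaller tau widens the zero
    bucket (the 'deadzone')."""
    sign = -1 if x < 0 else 1
    return sign * int(math.floor(abs(x) / delta + tau))

def reconstruct(q: int, delta: float, tau: float = 0.5) -> float:
    """Mid-bucket reconstruction consistent with quantize() above.
    EEM instead places the deadzone so that the conditional expectation
    of the source within each bucket matches the reconstruction point."""
    if q == 0:
        return 0.0
    sign = -1 if q < 0 else 1
    return sign * (abs(q) - tau + 0.5) * delta
```

With tau = 0.25, for example, the input -1.2 still maps to index -1, but its bucket center, and hence its reconstruction, shifts outward to -1.25.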

Thomas Richter. SSIM as Global Quality Metric: A Differential Geometry View. In Third International Workshop on Quality of Multimedia Experience (QoMEX 2011), pages 189-194, Mechelen, September 2011. IEEE, IEEE. [ bib | external link DOI | external link www ]

While traditional image quality metrics like MSE are mathematically well understood and tractable, they are known to correlate weakly with image distortion as observed by human observers. To address this situation, many full-reference quality indices have been suggested over the years that correlate better with human perception, one of them being the well-known Structural Similarity Index by Wang and Bovik. However, while these expressions show higher correlations, they are often not very tractable mathematically, and, in particular, are rarely metrics in the strict mathematical sense. Specifically, the triangle inequality is often not satisfied, which could either be seen as an effect of the human visual system being unable to compare images that are visually too different, or as a defect of the index in capturing the global situation correctly. In this article, the latter position is taken, and it is shown how the SSIM can be understood as a local approximation of a global metric, namely the geodesic distance on a manifold. While this metric cannot be computed explicitly in most cases, it is nevertheless shown that in specific cases its expression is identical to Weber's law of luminance sensitivity of the human eye.
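
The local quantity the article takes as its starting point is the standard SSIM of Wang and Bovik; for a single pair of patches it reads, in its usual combined form (the constants are the conventional choices for 8-bit data; the function name is ours):

```python
def ssim_patch(x, y, c1=(0.01 * 255) ** 2, c2=(0.03 * 255) ** 2):
    """SSIM between two equally sized patches: a product of luminance
    and contrast/structure comparisons, stabilized by c1 and c2."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / (n - 1)    # sample variances
    vy = sum((b - my) ** 2 for b in y) / (n - 1)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (n - 1)
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx * mx + my * my + c1) * (vx + vy + c2))
```

A full-image index averages this quantity over a sliding window; the article's point is that a distance derived from such a local similarity need not satisfy the triangle inequality globally.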

A. Gallardo, T. Richter, P. Debicki, L. Bellido, V. Mateos, and V. Villagra. A rig booking system for on-line laboratories. In Global Engineering Education Conference (EDUCON), 2011 IEEE, pages 643-648, April 2011. [ bib | external link DOI | external link www ]

Recently, many educational institutions have acknowledged the importance of making laboratories available on-line, allowing their students to run experiments from a remote computer. While usage of virtual laboratories scales well, remote experiments, which are based on scarce and expensive rigs, i.e., physical resources, do not scale and typically can only be used by one person or cooperating group at a time. It is therefore necessary to administer the access to rigs, where we distinguish between three different roles: content providers, teachers and students. This paper reports on a conceptual model and technical design of a rig booking system that provides mechanisms for content providers and teachers to control and grant access to on-line remote laboratories. The design of the booking system is based on a requirements analysis carried out by the EC funded LiLa project in cooperation with international partners from the Global Online Lab Consortium, GOLC.

P.P. Grube, D. Boehringer, T. Richter, C. Spiecker, N. Natho, C. Maier, and D. Zutin. A metadata model for online laboratories. In Global Engineering Education Conference (EDUCON), 2011 IEEE, pages 618 - 622, Amman, April 2011. IEEE, IEEE. [ bib | external link DOI | external link www ]

Making online laboratories available to the wider public requires them to be retrievable and reusable. To this end, we define a metadata set providing all required information in a machine readable form. This article presents the metadata set currently under discussion by the Global Online Lab Consortium (GOLC) as well as the issues of defining widely acceptable controlled vocabularies to describe the scientific field of remote experimentation.

Thomas Richter. Deadzone Based Rate Allocation for JPEG XR. In Data Compression Conference (DCC), 2011, page 474, Snowbird, UT, March 2011. IEEE, IEEE. [ bib | external link DOI | external link www ]

Similar to the JPEG image compression standard, JPEG XR controls the image quality loss, and hence the output rate, solely by means of the quantizer bucket sizes; a precise rate control mechanism like the EBCOT rate allocation algorithm in JPEG 2000 is not specified, and hence rate-distortion optimality of the quantizer is, in general, not given.

In this work, a simple rate-control mechanism for JPEG XR is introduced that allows an efficient control of the quantizer towards rate-distortion optimality. One possibility to implement this quantizer control would be to use the spatial variable quantization feature of JPEG XR, but it was seen in an earlier work that the additional side information required to transmit the quantization setting almost compensates the PSNR gain of variable quantization and complicates the rate allocation process by requiring an additional quantizer allocation step.

However, while JPEG XR defines the image reconstruction process completely, an encoder still has the freedom to select the deadzone size of the quantizer; this mechanism has the advantage that no additional side information needs to be transmitted and that the deadzone size is not, unlike the quantizer bucket size, constrained to a set of pre-defined values specified in the standard.

It is found that the image quality of JPEG XR can be improved by about 0.2 to 0.4 dB by performing a rate-distortion optimal selection of the deadzone; this gain is seen to be comparable to the PSNR loss of a JPEG 2000 codec where, for experimental reasons, EBCOT rate control has been turned off.

M. Stiemerling and S. Kiesel. ALTO Deployment Considerations. Internet-Draft draft-ietf-alto-deployments-00.txt, IETF Secretariat, February 2011. [ bib | external link www ]

Many Internet applications are used to access resources, such as pieces of information or server processes, which are available in several equivalent replicas on different hosts. This includes, but is not limited to, peer-to-peer file sharing applications. The goal of Application-Layer Traffic Optimization (ALTO) is to provide guidance to these applications, which have to select one or several hosts from a set of candidates that are able to provide a desired resource. The protocol is under specification in the ALTO working group. This memo discusses deployment-related issues of ALTO for peer-to-peer applications and CDNs, some preliminary security considerations, and initial guidance for application designers using ALTO.

David J. Lutz. Bridging between SAML-based payment and other identity federation payment systems. In Digital Enterprise and Information Systems - International Conference, DEIS 2011, London, UK, July 20 - 22, 2011. Proceedings, pages 172-186, 2011. [ bib | external link DOI | external link www ]

David J. Lutz. Payment processes for identity federations: the SAML-based payment approach. PhD thesis, University of Zurich, 2011. [ bib ]

Veronica Mateos, Alberto Gallardo, Thomas Richter, Luis Bellido, Peter Debicki, and Víctor A. Villagrá. LiLa Booking System: Architecture and Conceptual Model of a Rig Booking System for On-Line Laboratories. In International Journal of Online Engineering (iJOE), volume 7, pages 26-35. International Association of Online Engineering, 2011. [ bib | external link www ]

Many educational institutions have acknowledged the importance of providing online access to student laboratories. To optimize the use of the scarce and expensive resources such laboratories depend on, it is advisable to set up a booking system that helps to administer access to them. This paper reports on the architecture and conceptual model of the rig booking system designed for the LiLa Portal, a web portal that makes virtual and remote experiments available on the Internet. The design of the booking system is based on a requirements analysis carried out by the EC funded LiLa project in cooperation with international partners from the Global Online Lab Consortium, GOLC.

Andreas Menychtas, Dimosthenis Kyriazis, Spyridon V. Gogouvitis, Karsten Oberle, Thomas Voith, Georgina Gallizo, Sören Berger, Eduardo Oliveros, and Mike J. Boniface. A cloud platform for real-time interactive applications. In Frank Leymann, Ivan Ivanov, Marten van Sinderen, and Boris Shishkov, editors, CLOSER, pages 397-403. SciTePress, 2011. [ bib | external link www ]

2010

Thomas Richter. On the duality of rate allocation and quality indices. In Picture Coding Symposium (PCS), pages 270-273, Nagoya, Japan, December 2010. IEEE. [ bib | external link DOI | external link www ]

In a recent work, the author proposed to study the performance of still image quality indices such as the SSIM by using them as the objective function of rate allocation algorithms. The outcome of that work was not only a multi-scale SSIM optimal JPEG 2000 implementation, but also a first-order approximation of the MS-SSIM that is surprisingly similar to more traditional contrast-sensitivity and visual masking based approaches. It will be seen in this work that the only difference between the latter works and the MS-SSIM index is the choice of the exponent of the masking term, and furthermore, that a slight modification of the SSIM definition reproducing the traditional exponent is able to improve the performance of the index at or below the visual threshold. It is hence demonstrated that the duality of quality indices and rate allocation helps to improve both the visual performance of the compression codec and the performance of the index.

Thomas Richter. Rate Allocation as Quality Index Performance Test. In Applications of Digital Image Processing XXXIII, volume 7798, San Diego, CA, September 2010. SPIE, SPIE. [ bib | external link DOI | external link www ]

In a recent work, the author proposed to study the performance of still image quality indices such as the SSIM by using them as the objective function of a rate allocation algorithm. The outcome of that work was not only a multi-scale SSIM optimal JPEG 2000 implementation, but also a first-order approximation of the MS-SSIM that is surprisingly similar to more traditional contrast-sensitivity and visual masking based approaches. It will be seen in this work that the only difference between the latter works and the MS-SSIM index is the choice of the exponent of the masking term, and furthermore, that a slight modification of the SSIM definition that reproduces more traditional exponents is able to improve the correlation with subjective tests and also improves the performance of the SSIM optimized JPEG 2000 code. That is, understanding the duality of quality indices and rate allocation helps to improve both the visual performance and the performance of the index.

Kil Joong Kim, Bohyoung Kim, R. Mantiuk, T. Richter, Hyunna Lee, Heung-Sik Kang, Jinwook Seo, and Kyoung Ho Lee. A Comparison of Three Image Fidelity Metrics of Different Computational Principles for JPEG2000 Compressed Abdomen CT Images. IEEE Transactions on Medical Imaging, 29(8):1496-1503, August 2010. [ bib | external link DOI | external link www ]

This study aimed to evaluate three image fidelity metrics of different computational principles, namely peak signal-to-noise ratio (PSNR), high-dynamic range visual difference predictor (HDR-VDP), and multiscale structural similarity (MS-SSIM), in measuring the fidelity of JPEG2000 compressed abdomen computed tomography images from the viewpoint of visually lossless compression. Three hundred images with 0.67- or 5-mm section thickness were compressed to one of five compression ratios ranging from reversible compression to 15:1. The fidelity of each compressed image was measured by five radiologists' visual analyses (distinguishable or indistinguishable from the original) and the three metrics. The Spearman rank correlation coefficients of the PSNR, HDR-VDP, and MS-SSIM values with the number of readers responding as indistinguishable were 0.86, 0.94, and 0.86, respectively. Using the pooled readers' responses as the reference standard, the area under the receiver-operating-characteristic curve for the HDR-VDP (0.99) was significantly greater than that for the PSNR (0.95) (p < 0.001) and for the MS-SSIM (0.96) (p = 0.003), and there was no significant difference between the PSNR and MS-SSIM (p = 0.70). In measuring image fidelity, the HDR-VDP outperforms the PSNR and MS-SSIM, and the MS-SSIM and PSNR are comparable.
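
Of the three metrics compared, PSNR is the simplest and serves as the reference point; a minimal sketch of its standard definition (the 255 default assumes 8-bit data, whereas CT images would use a larger peak value such as 4095 for 12-bit data; the function name is ours):

```python
import math

def psnr(orig, recon, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB between two equally sized
    pixel sequences; infinite when the images are identical."""
    mse = sum((a - b) ** 2 for a, b in zip(orig, recon)) / len(orig)
    return float("inf") if mse == 0 else 10.0 * math.log10(peak * peak / mse)
```

Unlike HDR-VDP and MS-SSIM, this quantity depends only on the pixel-wise error energy, which is one reason the perceptually motivated metrics can outperform it in visually lossless threshold studies such as this one.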

Barbara Burr, Peter Göhner, Wolfram Ressel, Wolfgang Schlicht, and Sabina Jeschke. Spirit: University of Stuttgart's life-cycle-based gender-mainstreaming-concept. In Education Engineering (EDUCON), 2010 IEEE, Education Engineering (EDUCON), 2010 IEEE, pages 853-860, Madrid, Spain, April 2010. IEEE. [ bib | external link DOI ]

In spite of social and political efforts to achieve equal opportunities, women remain a minority in natural sciences, technical and related fields. We hereby present the gender concept of the University of Stuttgart. First, the steps for promotion of female students within natural sciences and technical fields are developed.

Thomas Richter, Yvonne Tetour, and David Boehringer. Simulations in Undergraduate Electrodynamics: Virtual Laboratory Experiments on the Wave Equation and their Deployment. In Education Engineering (EDUCON), 2010 IEEE, Education Engineering (EDUCON), 2010 IEEE, pages 1091-1097, Madrid, Spain, April 2010. IEEE. [ bib | external link DOI | external link www ]

Experiments play a vital role in undergraduate engineering education: they allow students to learn the foundations of engineering in practical hands-on courses. However, lack of funding and increasing costs for equipment make it harder and harder to supply a complete pool of experiments for large student classes. The EU-funded “Library of Labs” project aims to counterbalance this development by creating an EU-wide network of remotely controlled experiments and virtual laboratories. Remote experiments are real experiments controlled remotely over a network; virtual laboratories are simulation environments that use the component metaphor of a real laboratory. In this paper, we introduce such a virtual laboratory developed at the University of Stuttgart; its aim is to help students in the undergraduate physics course for engineers understand abstract phenomena by visualizing the underlying mathematics. We demonstrate this on a particular use case, the wave equation and phenomena related to it, as they are discussed in undergraduate physics, and show how to implement it as a simulation in the virtual laboratory. In cooperation with the physics department, a deployment plan for this experiment and related experiments has been created for the lecture “Physics for Engineering”, which is also presented and discussed.

Thomas Richter. Spatial Constant Quantization in JPEG XR is Nearly Optimal. In Data Compression Conference (DCC), 2010, pages 79-88, Snowbird, UT, March 2010. IEEE. [ bib | external link DOI | external link www ]

The JPEG XR image compression standard, originally developed under the name HD-Photo by Microsoft, offers the feature of spatially variable quantization; its codestream syntax allows selecting one out of a limited set of possible quantizers per macroblock and per frequency band. In this paper, an algorithm is presented that finds the rate-distortion optimal set of quantizers, and the optimal quantizer choice for each macroblock. Even though it seems plausible that this feature may provide a huge improvement for images whose statistics are non-stationary, e.g. compound images, it is demonstrated that the PSNR improvement is not larger than 0.3dB for a two-step heuristic of feasible complexity, although improvements of up to 0.8dB for compound images are possible with a much more complex optimization strategy.
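The per-macroblock quantizer choice described above can be sketched as a Lagrangian rate-distortion minimization: for each block, pick the quantizer minimizing D + λR. The (rate, distortion) tables and the multiplier `lam` below are hypothetical stand-ins for values an encoder would measure.

```python
def pick_quantizers(rd_tables, lam):
    """For each macroblock, choose the quantizer index minimizing the
    Lagrangian cost D + lam * R from a per-block list of (rate, distortion)
    pairs, one pair per allowed quantizer. Returns (choices, total_cost)."""
    choices, total = [], 0.0
    for table in rd_tables:
        costs = [d + lam * r for (r, d) in table]
        best = min(range(len(costs)), key=costs.__getitem__)
        choices.append(best)
        total += costs[best]
    return choices, total

# two hypothetical macroblocks, three candidate quantizers each:
# coarser quantizers cost fewer bits but add distortion
blocks = [[(100, 5.0), (60, 10.0), (30, 20.0)],
          [(120, 2.0), (70, 4.0), (35, 12.0)]]
print(pick_quantizers(blocks, lam=0.1))
```

Sweeping `lam` from 0 upward traces out the operating points from highest quality to lowest rate, which is how a rate target is met in such schemes.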

Barbara Burr, Thomas Richter, Sabina Jeschke, Nicole Natho, and Olivier Pfeiffer. Virtual and Remote Laboratories in Distance Education. In 3rd Annual Forum on e-learning Excellence in the Middle East, Dubai, UAE, February 2010. [ bib ]

Practical exercises in the laboratory form a cornerstone of academic education in engineering and natural sciences. Besides experiment and theory, simulations nowadays become increasingly relevant. In distance education in particular, the absence of in-class teaching makes a comprehensive IT infrastructure necessary, allowing students and professors to mutually use a broad variety of remotely controllable experimental setups and virtual laboratories. So far, such solutions have mostly been limited to individual universities and research institutions. Since these are by construction available over the Internet, and thus independent of location, the challenge is to build a comprehensive web-based portal infrastructure for experimental setups under an open source/access/content policy. In this article we discuss the underlying architecture, give some examples of typical components and highlight our background and motivation.

David J. Lutz and Burkhard Stiller. Combining identity federation with payment: The SAML-based payment protocol. In IEEE/IFIP Network Operations and Management Symposium, NOMS 2010, 19-23 April 2010, Osaka, Japan, pages 495-502, 2010. [ bib | external link DOI | external link www ]

David J. Lutz, Dominik Lamp, Patrick Mandic, Fabio Victora Hecht, and Burkhard Stiller. Charging of SAML-based federated VoIP services. In Proceedings of the 5th International Conference for Internet Technology and Secured Transactions, ICITST 2010, London, United Kingdom, November 8-10, 2010, pages 1-8, 2010. [ bib | external link www ]

D. Kyriazis, R. Einhorn, L. Fürst, M. Braitmaier, D. Lamp, K. Konstanteli, G. Kousiouris, A. Menychtas, E. Oliveros, N. Loughran, and B. Nasser. A methodology for engineering real-time interactive multimedia applications on service oriented infrastructure. In IADIS International Conference Applied Computing 2010, 2010. [ bib ]

Dimosthenis Kyriazis, Andreas Menychtas, Karsten Oberle, Thomas Voith, Michael Boniface, Eduardo Oliveros, Tommaso Cucinotta, and Sören Berger. A real-time service oriented infrastructure, 2010. [ bib ]

T. Cucinotta, F. Checconi, G. Kousiouris, D. Kyriazis, T. Varvarigou, A. Mazzetti, Z. Zlatev, J. Papay, M. Boniface, S. Berger, D. Lamp, T. Voith, and M. Stein. Virtualised e-Learning with Real-Time Guarantees on the IRMOS Platform. In Service-Oriented Computing and Applications (SOCA), 2010 IEEE International Conference on, pages 1-8, 2010. [ bib | external link DOI ]

2009

S. Jeschke, E. Hauck, O. Pfeiffer, and T. Richter. Robotics as demonstrator discipline for networked virtual and remote laboratories. In Advanced Robotics and its Social Impacts (ARSO), 2009 IEEE Workshop on, pages 109-113, November 2009. [ bib | external link DOI | external link www ]

Based on the BW-eLabs platform, the goal of NetLabs is the development of a ubiquitous software portal allowing researchers to move back and forth between interlinked experimental setups and simulations in a 3D knowledge space. Robotics - as one of the most important key technologies of the 21st century - is in special need of “virtual infrastructure support” because of its cost-intensiveness. The environments in which these systems can be examined, both in a simulative and in a realistic manner, are manifold. Due to the accessibility of the different robotics applications, new trials can be tested faster and at a reasonable price. In NetLabs, a role- and rights-based model is developed that governs access to experiments and measured data. The necessary components are integrated in the 3D Wonderland engine.

Thomas Richter, Bernard Brower, Stephen Martucci, and Alexis Tzannes. Interoperability in JPIP and its Standardization in JPEG 2000 Part 9. In SPIE Applications of Digital Image Processing XXXII, volume 7443. SPIE, September 2009. [ bib | external link DOI | external link www ]

The ISO standard JPEG 2000 Part 9 (15444-9) specifies a versatile and flexible image browsing and delivery protocol that allows the interactive selection of regions of large images and their transmission over a narrow-bandwidth connection. However, due to this enormous flexibility, achieving interoperability between software from differing vendors is not an easy task. To address this challenge, the JPEG committee started an initiative in the form of an amendment to 15444-9 to establish common grounds on which interoperability can be defined. The outcome of this work is a set of recommendations on which subsets of JPIP vendors should focus, hopefully easing the adoption of JPIP by identifying the options the committee found in widespread use. In this paper, the design and evolution of JPIP interoperability is discussed, the grounds on which interoperability can be achieved - variants and profiles - are introduced, and their design is motivated. The paper closes with an outlook on how to extend this amendment for future applications.

Thomas Richter. Evaluation of Floating Point Image Compression. In QoMEX 2009, First International Workshop on Quality of Multimedia Experience, pages 222-227, San Diego, CA, USA, July 2009. IEEE. [ bib | external link DOI | external link www ]

Recently, compression of High Dynamic Range (HDR) photography gained attention in the standardization of the Microsoft HDPhoto compression scheme as JPEG-XR. While integer data of 16 bits/pixel (bpp) in scRGB color-space can represent images up to a dynamic range of about 3.5 magnitudes in luminance - a noteworthy improvement over the 1.6 magnitudes possible in sRGB - even higher ranges are more efficiently represented by floating-point number formats. However, traditional means to evaluate image quality are rarely suitable for such data: they are often only calibrated to low dynamic ranges (LDR) of 8bpp, and are not designed to take the peculiarities of floating-point data into account. In this work, we present two approaches to deal with this problem: introducing a (mathematical) quality index related to SSIM [?] that is more suitable for floating-point data, and presenting an independent image quality evaluation framework that is able to apply LDR metrics to HDR data. The presented ideas are then tested on the HDPhoto floating-point compression, on a proprietary backwards compatible extension of JPEG [?] and on a proposed floating-point compression scheme based on JPEG 2000 [?] that is also proven to be optimal in the proposed quality index. It is then shown that both approaches, the proposed metric and PSNR in the LDR domain, deliver comparable results.
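One way to apply an LDR metric such as PSNR to HDR data, in the spirit of the evaluation framework above, is to map floating-point luminances to 8-bit codes first. The logarithmic mapping and its range below are illustrative assumptions, not the paper's actual framework.

```python
import math

def log_map(hdr, lo=1e-4, hi=1e4):
    """Map positive floating-point luminances to 8-bit codes on a log
    scale spanning [lo, hi]. An illustrative stand-in for an LDR mapping,
    not the framework of the paper."""
    span = math.log10(hi) - math.log10(lo)
    out = []
    for v in hdr:
        v = min(max(v, lo), hi)                      # clamp to the mapped range
        out.append(round(255 * (math.log10(v) - math.log10(lo)) / span))
    return out

def psnr(a, b, peak=255):
    """Peak signal-to-noise ratio in dB between two code sequences."""
    mse = sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)
    return float('inf') if mse == 0 else 10 * math.log10(peak * peak / mse)

# hypothetical reference and distorted luminance samples
ref = [0.01, 0.5, 120.0, 3000.0]
dist = [0.011, 0.48, 118.0, 3100.0]
print(psnr(log_map(ref), log_map(dist)))
```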

Thomas Richter. On the mDCT-PSNR Image Quality Index. In QoMEX 2009, First International Workshop on Quality of Multimedia Experience, pages 53-58, San Diego, CA, USA, July 2009. IEEE. [ bib | external link DOI | external link www ]

Full-reference image quality indices assign a quality index to a pair consisting of an undistorted reference image and a distorted image to be assessed; the quality of the index itself is then defined by its ability to predict the outcome of subjective tests performed by human observers judging the quality of the same image pair. In this article, a new DCT-based image quality index is introduced whose complexity lies between that of algorithms like SSIM and complex, HVS-based algorithms like VDP. Detailed experiments on the LIVE database show that the proposed algorithm performs best close to or below the visual threshold, where it outperforms existing algorithms like SSIM or VDP while requiring only moderate complexity.

Thomas Richter, David Boehringer, and Sabina Jeschke. LiLa: A European Project on Networked Experiments. In 6th International Conference on Remote Engineering and Virtual Instrumentation, Bridgeport,CT,USA, June 2009. REV, Kassel University Press. [ bib ]

The LiLa project - short for “Library of Labs” - is a European Community funded project to network remote experiments and virtual laboratories. The goal of this project is the composition and dissemination of a European infrastructure for mutual exchange of experimental setups and simulations, specifically targeted at undergraduate studies in engineering and science. This article discusses the architecture of the project, introduces its components and sheds some light on our motivation and background.

David Boehringer, Barbara Burr, Sabina Jeschke, and Thomas Richter. Networking Virtual and Remote Experiments in LiLa. In Proceedings of the 2009 World Conference on Educational Multimedia, Hypermedia & Telecommunications, 2009. [ bib ]

Thomas Richter. Perceptual Image Coding by Standard-Constraint Codecs. In Picture Coding Symposium, 2009. PCS 2009, pages 1-4, Chicago, IL, USA, May 2009. [ bib | external link DOI | external link www ]

A perceptual image compression codec exploits the characteristics of the human senses to minimize the perceivable quality loss of digital images under compression. Such a codec has an even higher value if the resulting codestreams are compatible with an existing standard, and are thus decodable by everyday, existing applications. This work first describes three basic mechanisms perceptual coding is based on today, followed by strategies for implementing them in standardized environments, namely JPEG, JPEG 2000 and JPEG-XR. Following that, strategies to evaluate the success of perceptual coding are discussed, namely subjective measurements and objective quality metrics. Finally, the circle is closed back to compression codecs by showing, on the example of JPEG 2000 and SSIM, that quality metrics can also be used to drive the rate allocation of a codec, and hence exploit the quality judgment of a metric directly.

Sabina Jeschke, Barbara Burr, Jens U. Hahn, Leni Helmes, Walter Kriha, Michael Krüger, Andreas W. Liehr, Wolfgang Osten, Olivier Pfeiffer, Thomas Richter, Gerhard Schneider, Werner Stephan, and Karl-Heinz Weber. BW-eLabs - Knowledge Management in Virtual and Remote Labs. In 4th International Workshop on Distributed Cooperative Laboratories: Instrumenting the Grid, Alghero, Sardinia, Italy, April 2009. [ bib ]

The architecture of the networked virtual laboratories and remote experiments in Baden-Württemberg (BW-eLabs) aims at extending access to diverse remote and virtual experimental resources as well as at effective and enduring storage, indexing and (re-)use of experimental raw data for academic and scientific objectives. Following an open access policy, BW-eLabs understands itself as an open network for scientific data and experimental setups. In this context, facilitating external access to local experimental setups and, at the same time, safeguarding reproducibility and transparency of the experiments are vital requirements. Thus, one of the features of BW-eLabs is that documents and data are monitored along their whole life cycle and embedded into a process chain of experimental environments. Hence, the incorporation of existing infrastructures, e.g. decentralized repositories, tools, and digital libraries, into the virtual 3D platform of BW-eLabs is another principal point. (Remote) access to experimental equipment is an important prerequisite to ensure that all scientific communities involved can use the appropriate professional tools. The outstanding cost-intensiveness of experimental equipment in this area recommends nanotechnology as a demonstrator discipline. Advancement of cooperation and collaboration in scientific communities in high-tech fields takes centre stage in this notion.

S. Jeschke, O. Pfeiffer, and T. Richter. VideoEasel - A flexible programmable simulation environment for discrete many body systems. In GCC Conference Exhibition, 2009 5th IEEE, pages 1-5, March 2009. [ bib | external link DOI | external link www ]

In this work, we present a Virtual Laboratory providing a simulation framework for discrete many-body systems. Programs defining the dynamics of the system and instruments measuring on the simulation can easily be implemented within its own programming language, and can be linked and edited at run time. The class of systems covered by this framework reaches from discrete difference equations over classical many-body problems in physics to research problems in image processing, allowing us to apply this laboratory in education and research.

Thomas Richter and Kil Joong Kim. A MS-SSIM Optimal JPEG 2000 Encoder. In Data Compression Conference, pages 401-410, Snowbird, UT, USA, March 2009. IEEE. [ bib | external link DOI | external link www ]

S. Steger and T. Richter. Universal Refinable Trellis Coded Quantization. In Data Compression Conference, 2009. DCC '09., pages 312-321, March 2009. [ bib | external link DOI | external link www ]

We introduce a novel universal refinable trellis quantization scheme (URTCQ) that is suitable for bitplane coding with many reconstruction stages. Existing refinable trellis quantizers either require excessive codebook training and are outperformed by scalar quantization for more than two stages (MS-TCQ, E-TCQ), impose a huge computational burden (SR-TCQ), or achieve a good rate-distortion performance in the last stage only (UTCQ). The presented quantization technique is a mixture of a scalar quantizer and an improved version of the E-TCQ. For all supported sources, only a one-time training on an i.i.d. uniform source is required, and its incremental bitrate is not more than 1 bps for each stage. The complexity is proportional to the number of stages and the number of trellis states. We compare the rate-distortion performance of our work on generalized Gaussian i.i.d. sources with the quantizers deployed in JPEG2000 (USDZQ, UTCQ). It turns out that it is in no stage worse than the scalar quantizer and usually outperforms the UTCQ except in the last stage.
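The stagewise-refinement idea underlying such quantizers can be illustrated with a plain scalar quantizer whose cell is halved with each extra bit. This sketch shows successive refinement only; it is not the trellis-coded scheme of the paper, and the range and step are hypothetical.

```python
def refine(value, step=1.0, stages=4):
    """Successive refinement of a uniform scalar quantizer: each stage
    halves the cell containing `value`, i.e. spends one extra bit.
    Assumes 0 <= value < step * 2**stages. Returns the midpoint
    reconstruction after each stage; the reconstruction error shrinks
    as more stages (bits) are received."""
    lo, hi = 0.0, step * (1 << stages)   # coarsest cell covering the range
    recon = []
    for _ in range(stages):
        mid = (lo + hi) / 2
        if value >= mid:                 # one bit: which half holds the value
            lo = mid
        else:
            hi = mid
        recon.append((lo + hi) / 2)      # midpoint reconstruction
    return recon

print(refine(5.3, step=1.0, stages=4))  # reconstructions approach 5.3
```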

David Lutz and Burkhard Stiller. Applied Federation Technology: The Charging of Roaming Students. Terena Networking Conference 2009, 2009. [ bib ]

Thomas Richter, David Boehringer, and Sabina Jeschke. LiLa: Ein Europäisches Projekt zur Vernetzung von Experimenten. In E-Learning 2009. Lernen im digitalen Zeitalter, Münster, 2009. Waxmann Verlag GmbH. [ bib | external link www ]

LiLa - kurz für Library of Labs - ist ein von der EU im Rahmen des eContentplus-Programmes gefördertes Projekt zur Vernetzung von fernsteuerbaren Experimenten und virtuellen Laboren. Ziel des Projektes ist der Aufbau einer europaweiten Infrastruktur zur gegenseitigen Nutzung von Experimentalaufbauten und Simulationssoftware zur Verbesserung der Lehre im Grund- bzw. Bachelorstudium der ingenieur- und naturwissenschaftlichen Studienfächer. In diesem Artikel besprechen (... die Autoren) die Architektur des Projektes, geben einige Beispiele für typische Komponenten und beleuchten die Hintergründe und ihre Motivation. (DIPF/ Orig.)

2008

Frances Cleary, Antonio Romero, Jürgen Jähnert, and Yongzheng Liang. Daidalos II: Implementing a Scenario Driven Process. Third International Conference on Software Engineering Advances, October 2008. [ bib ]

Sabina Jeschke, Thomas Richter, and Uwe Sinha. Embedding Virtual and Remote Experiments Into a Cooperative Knowledge Space. In The 2008 Frontiers in Education Conference (FIE 2008), Saratoga, NY, October 2008. IEEE. [ bib | external link www ]

Today, experimental environments and setups in natural sciences and engineering are neither sufficiently available nor accessible enough to cover the broad demand. Yet, they form an essential part of the scientific methodology within the technological disciplines. Additionally, the ability to cooperate and work in teams when performing experiments is crucial. By integrating experimental setups into a virtual cooperative knowledge space, availability and accessibility can be enhanced for a wide range of people, working individually or in groups, making them independent of limitations in time, budget or access to classical laboratories. This article describes a SOAP-based architecture by which this objective can be achieved, and which is currently being implemented for CURE, a room-based cooperative knowledge space platform developed at FernUniversitaet Hagen.

Thomas Richter. Effective Visual Masking Techniques in JPEG2000. In Image Processing, 2008. ICIP 2008. 15th IEEE International Conference on, pages 2876-2879, San Diego, CA, October 2008. [ bib | external link DOI | external link www ]

This paper introduces a very low complexity visual masking algorithm for the JPEG2000 image compression standard and evaluates its impact on the visual image quality by means of the Multi-scale SSIM index. The algorithm derives suitable weighting masks indirectly from a statistical model of the wavelet data which is defined from the second moment and the average absolute amplitude of the data. If combined with an a priori rate allocation algorithm, the computation of the visual masking weights has almost no overhead at all.
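A masking weight of the kind described above might be sketched as follows; the way the second moment and the mean absolute amplitude are combined, and the exponent `alpha`, are illustrative assumptions, not the paper's exact formula.

```python
def masking_weight(coeffs, alpha=0.7, eps=1e-6):
    """Hypothetical masking weight for one codeblock: the second moment
    and the mean absolute amplitude of its wavelet coefficients gauge
    local activity; busier (more textured) blocks get a smaller weight,
    so their MSE contribution counts less during rate allocation."""
    n = len(coeffs)
    second_moment = sum(c * c for c in coeffs) / n
    mean_abs = sum(abs(c) for c in coeffs) / n
    activity = second_moment / (mean_abs + eps)  # large in textured blocks
    return 1.0 / (1.0 + activity) ** alpha

# flat regions keep a high weight, textured regions are down-weighted
flat = [0.1, -0.2, 0.1, 0.0]
textured = [5.0, -7.0, 6.5, -4.0]
print(masking_weight(flat) > masking_weight(textured))  # → True
```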

Thomas Richter. Visual Quality Improvement Techniques of HDPhoto/JPEG-XR. In Image Processing, 2008. ICIP 2008. 15th IEEE International Conference on, pages 2888-2891, San Diego, CA, October 2008. [ bib | external link DOI | external link www ]

Microsoft's recently proposed image compression codec HDPhoto is currently undergoing ISO standardization as JPEG-XR. Even though performance measurements carried out by the JPEG committee indicated that the PSNR performance of HDPhoto is competitive, the visual performance of HDPhoto showed notable deficits, both in subjective and objective tests. This paper introduces various techniques that improve the visual performance of HDPhoto without leaving the current codestream definition. Objective measurements performed by the author indicate that the modified encoder, while staying backwards compatible with the current standard proposal, improves visual performance significantly, and that the performance of the modified encoder is similar to JPEG.

Patrick Mandic and Jürgen Jähnert. Service-oriented network selection. Proceedings of the IEEE International Symposium on Wireless Communication Systems, pages 138-143, October 2008. [ bib ]

Sabina Jeschke, Thomas Richter, Thomas Isele, and Olivier Pfeiffer. Algorithms on Graphs: Automatic Course Verification in eLearning. In The 10th IASTED International Conference on Signal and Image Processing  IMSA 2008 , Kailua, Hawaii, August 2008. [ bib ]

Intelligent course management and training applications for students design eLearning courses as storyboard graphs whose nodes are elementary training units and whose edges encode dependencies between them. A well-designed storyboard is then able to adapt the course to the learner by observing the history of the learning path and proposing suitable course elements to the student. However, setting up a storyboard and testing it for correctness and completeness is a tedious task that can well be taken over by the computer, too. Since storyboards are mathematically described as graphs, known graph algorithms are readily deployed here and help authors to set up consistent courses.

In this article, we introduce our course management “Marvin”, describe its properties in an eLearning framework designed to run interactive experiments, so called “Virtual Labs”, and introduce the structure of its courses. We then define the terms correctness and completeness of a course within the system and describe algorithms that help authors to test for these properties.
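A completeness test of the kind described above reduces to graph reachability: every course unit should be reachable from the start node via the dependency edges. The sketch below uses a hypothetical adjacency-list representation, not Marvin's actual data structures.

```python
def unreachable_units(graph, start):
    """Storyboard completeness check: return the set of course units
    that cannot be reached from `start` by following dependency edges.
    `graph` maps each unit to the list of its successor units."""
    seen, stack = {start}, [start]
    while stack:                      # iterative depth-first traversal
        node = stack.pop()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return set(graph) - seen

# hypothetical storyboard with one authoring mistake
course = {
    "intro": ["basics"],
    "basics": ["exercise1", "exercise2"],
    "exercise1": [],
    "exercise2": [],
    "orphan": [],   # no path from the start node
}
print(unreachable_units(course, "intro"))  # → {'orphan'}
```

An empty result means the storyboard is complete in this sense; a non-empty result points the author directly at the disconnected units.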

Thomas Richter and Chaker Larabi. Towards Objective Image Quality Metrics: The AIC Eval Program of the JPEG. In Andy Tescher, editor, Applications of Digital Image Processing XXXI, San Diego, CA, August 2008. SPIE. [ bib ]

Objective quality assessment of lossy image compression codecs is an important part of the recent call of the JPEG for Advanced Image Coding. The target of the AIC ad-hoc group is twofold: first, to receive state-of-the-art still image codecs and to propose suitable technology for standardization; and second, to study objective image quality metrics to evaluate the performance of such codecs. Even though the performance of an objective metric is defined by how well it predicts the outcome of a subjective assessment, one can also study the usefulness of a metric indirectly in a non-traditional way, namely by measuring the subjective quality improvement of a codec that has been optimized for a specific objective metric. This approach is demonstrated here on the recently proposed HDPhoto format[?] introduced by Microsoft and an SSIM-tuned[?] version of it by one of the authors. We compare these two implementations with JPEG[?] in two variations and a visually and PSNR optimal JPEG2000[?] implementation. To this end, we use subjective and objective tests based on the multiscale SSIM and a new DCT-based metric.

Sabina Jeschke, Thomas Richter, Olivier Pfeiffer, Harald Scheel, and Christian Thomsen. Selected Aspects of Magnetism in Virtual Laboratories and Remote Experiments. In 9th Nordic Research Symposium on Science Education, Reykjavik, Iceland, June 2008. [ bib ]

The science of physics is built on theories and models as well as on experiments: the former structure relations and simplify reality to a degree such that predictions on physical phenomena can be derived by means of mathematics. The latter allow verifying - or falsifying - these predictions. Computer sciences allow a new access to this relationship, especially well-suited for education: New Media and New Technologies provide simulations for the model, virtual instruments for running and evaluating real experiments and mathematical toolkits to solve equations derived from the theory analytically and to compare the outcome of all three methods. We will demonstrate this approach on two examples: Ferro-magnetism and elementary thermodynamics.

Sabina Jeschke, Olivier Pfeiffer, and Thomas Richter. User Adaptive Interactive Courses in SCORM Compliant Learning Management Systems. In Conference IMCL 2008 Amman, Amman/Jordan, April 2008. [ bib ]

Traditional on-line courses are static: Unaware of the learner, they present the same content to every user that participates in the course, independent of the background and the experience of the learner. Furthermore, content is often static and leaves little freedom to the learner. One might argue that this is because currently applied standards like SCORM 1.2 do not allow much more than static content linked statically within the learning management system. However, while the upcoming SCORM 2004 addresses adaptivity at the level of the learning management system, we present a class of dynamic interactive content objects in this paper that provide adaptivity at the level of the learning objects themselves while also leaving lots of freedom to the learner. Since the data mining required for adaptivity happens outside of the learning management system, the presented learning objects already provide their full functionality within SCORM 1.2.

Sabina Jeschke, Thomas Richter, and Olivier Pfeiffer. User Adaptive Interactive Courses in SCORM Compliant Learning Management Systems. In International Conference on Technology Communication and Education, Al Kuwayt, Kuwayt, April 2008. [ bib ]

Traditional on-line courses are static: Unaware of the learner, they present the same content to every user that participates in the course, independent of the background and the experience of the learner. Furthermore, content is often static and leaves little freedom to the learner. One might argue that this is because currently applied standards like SCORM 1.2 do not allow much more than static content linked statically within the learning management system. However, while the upcoming SCORM 2004 addresses adaptivity at the level of the learning management system, we present a class of dynamic interactive content objects in this paper that provide adaptivity at the level of the learning objects themselves while also leaving lots of freedom to the learner. Since the data mining required for adaptivity happens outside of the learning management system, the presented learning objects already provide their full functionality within SCORM 1.2.

T. Richter. Effective Visual Masking Techniques in JPEG2000. In Data Compression Conference, 2008. DCC 2008, pages 540-540, March 2008. [ bib | external link DOI | external link www ]

Rate allocation in the JPEG2000 image compression algorithm is performed by the EBCOT algorithm, which measures file size and distortion, defined as mean square error (MSE). Since MSE correlates only moderately with visual quality, more advanced metrics like the MS-SSIM have been proposed. One exploitable effect of the human visual system is that of visual masking: if a structure of a fixed amplitude is overlaid by a texture, it becomes masked and less visible. This can be addressed in JPEG2000 by multiplying the MSE contribution of a codeblock by a factor mu computed from the neighbourhood of the data. Most of these techniques, however, require complex operations on the coefficients.

David Lutz and Burkhard Stiller. Token-based Payment in Dynamic SAML-based Federations. Resilient Networks and Services, AIMS 2008, 2008. [ bib ]

David Lutz, Patrick Mandic, Sascha Neinert, Ruth del Campo, and Jürgen Jähnert. Harmonizing Service and Network Provisioning for Federative Access in a Mobile Environment. NOMS 2008, IEEE/IFIP Network Operations & Management Symposium, 2008. [ bib ]

David Lutz. Cash Tokens for SAML based Federations. Enterprise Architecture and Services in the Finance Industry, FinanceCom 2007, 2008. [ bib ]

Jürgen Jähnert, Renato Roque, and Vasilis Kaldanis. Federation in Daidalos. ICT-MobileSummit 2008, 2008. [ bib ]

Jürgen Jähnert. Model Driven Testing. Testing & Quality Day, 2008. [ bib ]

Jürgen Jähnert. Commercialising the Mobile Internet from a Metering and Accounting Perspective. IADIS International Conference e-Society, 2008. [ bib ]

2007

Sven Grottke, Sabina Jeschke, and Thomas Richter. An Integrated Course on Wavelet-Based Image Compression - Learning Abstract Information Theory on Visual Data. In CISSE 2007, December 2007. [ bib ]

We describe the implementation of and our experiences with a capstone course on wavelet based image compression held at the Berlin University of Technology in the years 2002 to 2006. This course has been designed as an “integrated project”, which means that it combines lectures, seminar talks to be prepared and held by the students, and a programming part. The design goal of this course has been to provide all the necessary theoretical knowledge to understand the concepts behind image compression technologies, such as JPEG2000. We are also aiming at simulating the work-flow as found within an IT company as realistically as possible, preparing electrical engineers and computer scientists as well as possible for their professional life. This training does not only include the technical, but also the social skills required to successfully complete larger projects. The subject of image compression offers the advantage of requiring a solid knowledge on terms of information science such as entropy, distortion, quantization, Fourier and wavelet-transformation, but also offering a direct visual feedback of how these techniques perform. Therefore, we believe that image compression is an attractive topic to be used for a capstone course. Traditionally, a course would assign weekly programming exercises to the students; however, we believe this to be unsuitable for a capstone course as it does not simulate the work-flow of a professional software development team; furthermore, it does not require the degree of team-work we deem critical to modern software development. Thus, we divide students into groups of two to four people and assign each team to one sub-task of an image codec and provide some boiler-plate code of our own. Much to their astonishment, students soon find themselves spending a considerable amount of time with project management and coordination activities. 
That means, teams have to design interfaces and data structures to combine their efforts to create a working project, which adds an often underestimated social component to the course. With some guidance from the teachers, students have always been able to supply a working code at the end of the semester. Needless to say, the thrill of having a nontrivial working program at the end of the course is a major source of motivation for our students and adds much to the satisfaction and positive feedback we receive.

Sabina Jeschke, Olivier Pfeiffer, Thomas Richter, Harald Scheel, and Christian Thomsen. On Remote and Virtual Experiments in eLearning in Statistical Mechanics and Thermodynamics. In 6th Annual ASEE Global Colloquium on Engineering Education, Istanbul, October 2007. [ bib ]

The science of physics is built on theories and models as well as on experiments. Theories and models structure relations and simplify reality to such a degree that predictions about physical phenomena can be derived by means of mathematics. Experiments allow us to verify, or falsify, those predictions. Computer science offers a new approach to this relationship that is especially well-suited for education: New Media and New Technologies provide simulations for the model, virtual instruments for running and evaluating real experiments, and mathematical toolkits to solve the equations derived from the theory analytically and to compare the outcomes of all three methods. We demonstrate this approach with two examples: ferromagnetism and elementary thermodynamics.

Sabina Jeschke, Thomas Richter, Christian Thomsen, and Harald Scheel. On Remote and Virtual Experiments in eLearning. JSW, September 2007. [ bib ]

The science of physics is based on theories and models as well as on experiments: the former structure relations and simplify reality to such a degree that predictions about physical phenomena can be derived by means of mathematics; the latter allow verification or falsification of these predictions. Computer science offers a new approach to this relationship that is especially well-suited for education: New Technologies provide simulations for the model, virtual instruments for running and evaluating real experiments, and mathematical toolkits to solve the equations derived from the theory analytically and to compare the outcomes of all three methods. We demonstrate this approach with several examples: ferromagnetism, thermodynamics, and the harmonic oscillator. We furthermore give a brief example of an online tutoring system that makes our setup attractive for self-study outside the university campus.

Jürgen Jähnert. Verteilte Nutzungsdatenverarbeitung und nachgelagerte Weiterverarbeitung der Nutzungsdaten im Mobilen Internet. Dissertationsschrift, Institut für Kommunikationsnetze und Rechnersysteme, Universität Stuttgart, June 2007. [ bib ]

Sven Grottke and Thomas Richter. Learning Abstract Information Theory on Visual Data: An Integrated Course on Wavelet-Based Image Compression. In 2007 ASEE annual conference, Honolulu, Hawaii, June 2007. [ bib ]

We describe the implementation of and our experiences with a capstone course on wavelet-based image compression held at the Berlin University of Technology from 2002 to 2006. This course was designed as an “integrated project”, meaning that it combines lectures, seminar talks prepared and held by the students, and a programming part.

The design goal of this course was to provide all the theoretical knowledge necessary to understand the concepts behind image compression technologies such as JPEG2000. We also aim to simulate the work-flow found in an IT company as realistically as possible, preparing electrical engineers and computer scientists as well as possible for their professional lives. This training includes not only the technical but also the social skills required to successfully complete larger projects.

The subject of image compression offers the advantage of requiring solid knowledge of information-theoretic concepts such as entropy, distortion, quantization, and the Fourier and wavelet transforms, while also offering direct visual feedback on how these techniques perform. We therefore believe that image compression is an attractive topic for a capstone course.

Traditionally, a course would assign weekly programming exercises to the students; however, we believe this to be unsuitable for a capstone course, as it neither simulates the work-flow of a professional software development team nor requires the degree of team-work we deem critical to modern software development. We therefore divide the students into groups of two to four people, assign each team one sub-task of an image codec, and provide some boiler-plate code of our own. Much to their astonishment, students soon find themselves spending a considerable amount of time on project management and coordination activities: teams have to design interfaces and data structures to combine their efforts into a working project, which adds an often underestimated social component to the course. With some guidance from the teachers, students have always been able to deliver working code at the end of the semester. Needless to say, the thrill of having a nontrivial working program at the end of the course is a major source of motivation for our students and adds much to the satisfaction and positive feedback we receive.

Sabina Jeschke, Olivier Pfeiffer, Thomas Richter, Harald Scheel, and Christian Thomsen. On Remote and Virtual Experiments in eLearning in Statistical Mechanics and Thermodynamics. In 2007 ASEE annual conference, Honolulu, Hawaii, June 2007. [ bib ]

The science of physics is built on theories and models as well as on experiments: the former structure relations and simplify reality to such a degree that predictions about physical phenomena can be derived by means of mathematics; the latter allow us to verify, or falsify, these predictions. Computer science offers a new approach to this relationship that is especially well-suited for education: New Media and New Technologies provide simulations for the model, virtual instruments for running and evaluating real experiments, and mathematical toolkits to solve the equations derived from the theory analytically and to compare the outcomes of all three methods. We demonstrate this approach with two examples: ferromagnetism and elementary thermodynamics.

Antonio Cuevas, Jürgen Jähnert, Jose I. Moreno, Victor A. Villagra, Vicente Olmedo, and Stefan Wesner. The Akogrimo Service Provisioning Platform. Proceedings of the 7th International Workshop on Applications and Services in Wireless Networks, 2007, May 2007. [ bib ]

Sabina Jeschke and Thomas Richter. Selected Aspects of Networked Experiments in Cooperative Knowledge Spaces. In 2nd International Conference on Engineering Education & Training (ICEET-2), Kuwait, April 2007. [ bib ]

Virtual labs enable field-specific experiments and open them up for collaborative and distributed usage. In order to realize comprehensive laboratory set-ups that provide scientific broadness and user adaptivity, several challenges regarding the integration of different software technologies have to be solved. We propose an eLearning framework consisting of a learner model and a course model; exercises within this framework are supplemented by virtual laboratories and computer algebra systems. We discuss the potential of this setup using the example of the VideoEasel laboratory and its interface to Maple.

Thomas Richter, Sven Grottke, and Ruedi Seiler. Faster JPEG2000 Encoding With Apriori Rate Allocation. In International MultiConference of Engineers and Computer Scientists 2007, HongKong, April 2007. [ bib ]

Sabina Jeschke, Thomas Richter, Christian Thomsen, and Harald Scheel. On Remote and Virtual Experiments in eLearning in Statistical Mechanics and Thermodynamics. In Pervasive Computing and Communications Workshops, 2007. PerCom Workshops '07. Fifth Annual IEEE International Conference on, New York, March 2007. [ bib | external link DOI | external link www ]

The science of physics is based on theories and models as well as on experiments: the former structure relations and simplify reality to such a degree that predictions about physical phenomena can be derived by means of mathematics; the latter allow verification or falsification of these predictions. Computer science offers a new approach to this relationship that is especially well-suited for education: New Technologies provide simulations for the model, virtual instruments for running and evaluating real experiments, and mathematical toolkits to solve the equations derived from the theory analytically and to compare the outcomes of all three methods. We demonstrate this approach with two examples: ferromagnetism and thermodynamics.

David Lutz. Web2.0 for Machines and Services: Human Oriented Service Identity Management. Innovations 2007, 4th International Conference on Innovations in Information Technology, 2007. [ bib ]

David Lutz. Using Neural Gas for a Better Machine Identity Description. ASC 2007, 11th IASTED International Conference on Artificial Intelligence and Soft Computing, 2007. [ bib ]

David Lutz. Federation Payments using SAML Tokens with Trusted Platform Modules. ISCC'07, IEEE Symposium on Computers and Communications, 2007. [ bib ]

David Lutz. Secure AAA by means of Identity Tokens in Next Generation Mobile Environments. ICWMC 07, Third International Conference on Wireless and Mobile Communications, 2007. [ bib ]

Jürgen Jähnert, Antonio Cuevas, Jose I. Moreno, Victor A. Villagra, Stefan Wesner, Vicente Olmedo, and Hans Einsiedler. The Akogrimo way towards an extended IMS architecture. 11th International Conference on Intelligence in Service Delivery Networks, Bordeaux, 2007. [ bib ]

M. Waldburger, C. Morariu, P. Racz, J. Jähnert, S. Wesner, and B. Stiller. Grids in a Mobile World: Akogrimo's Network and Business Views. Praxis der Informationsverarbeitung und Kommunikation (PIK), Vol. 30, No. 1, pages 32-43, January 2007. [ bib ]

David Boehringer. Bridging the gaps: A community server for the connection of different learning management systems. In Proc. Online Educa, pages 393-394, Berlin, 2007. [ bib ]

Tom Kirkham, David Lutz, Jesus Movilla, Patrick Mandic, Julian Gallop, and Cristian Morariu. Identity Management in a Mobile Grid Environment. UK e-Science All Hands Meeting 2007, 2007. [ bib ]

2006

Yongzheng Liang, Jürgen Jähnert, and Paul Christ. Towards multi-dimensional scenario-driven process chain. Proceedings of the 17th World Wireless Research Forum Meeting, November 2006. [ bib ]

Jochen Kögel and Sebastian Kiesel. Security Impact of DNS Delegation Structure and Configuration Problems. Beiträge zum Essener Workshop zur Netzwerksicherheit (EWNS) 2006, October 2006. [ bib | pdf | www ]

Sven Grottke, Thomas Richter, and Ruedi Seiler. Apriori Rate Allocation in Wavelet-Based Image Compression. In Chinacom 2006, Beijing, China, October 2006. [ bib | external link www ]

We present a rate allocation scheme that pre-computes optimal quantization bucket sizes, based on a mathematical model of wavelet-transformed natural images, prior to entropy encoding. We combine our scheme with the JPEG2000 embedded rate allocator and find in our experiments that it is able to increase JPEG2000 encoding speed by a factor of two without losing image quality.

Zhikui Chen, Jürgen Jähnert, and Yan Tang. Analysis and Application of a QoS Scheme for WLAN Real-Time Video Communications. WMC 2006, World Mobile Congress, Munich, Germany, July 2006. [ bib ]

Zhikui Chen, Yan Tang, and Paul Christ. A cross-layer design for 4g wireless real-time video communication. ASWN 2006, Application and Services in Wireless Network, Berlin, May 2006. [ bib ]

Zhikui Chen, Yan Tang, and Paul Christ. H.264 Based Video Transmission in Cross-layer Wireless Communication Systems. World Wireless Congress 2006, May 2006. [ bib ]

Zhikui Chen, Yan Tang, and Paul Christ. Source Channel Adaptation as a QoS strategy for DAIDALOS. International Conference of Telecommunication 2006, May 2006. [ bib ]

Martin Waldburger, Cristian Morariu, Peter Racz, Jürgen Jähnert, Stefan Wesner, and Burkhard Stiller. Grids in a Mobile World: Akogrimo's Network and Business Views. IFI Technical Report, University of Zürich, No. ifi-2006.05, April 2006. [ bib ]

C. Werner, Y. Liang, J. Jähnert, and M. Ebner. Daidalos - A scenario based approach from scenarios towards integration. IST Mobile Summit 2006, 2006. [ bib ]

David Lutz and Ruth del Campo. Bridging the Gap between Privacy and Security in Multi-Domain-Federations with Identity Tokens. MobiQuitous 06, 3rd Annual International Conference on Mobile and Ubiquitous Systems, 2006. [ bib ]

R.L. Aguiar, J. Jähnert, A.F. Gomez Skarmeta, and C. Hauser. Identity Management in Federated Telecommunications Systems. Proceedings of the Workshop on Standards for Privacy in User-Centric Identity Management, 2006. [ bib ]

Zhikui Chen, Jürgen Jähnert, and Yan Tang. Analysis and Application of a QoS Scheme for WLAN Real-Time Video Communications. World Mobile Congress Munich, Germany, 2006. [ bib ]

2005

Zhikui Chen and Paul Christ. Improving Cross-layer QoS Performance in 4G Mobile Communications. Global Mobile Congress 2005, October 2005. [ bib ]

A. Cuevas et al. Usability and Evaluation of a Deployed 4G Network Prototype. Journal of Communications and Networks, June 2005. [ bib ]

Gustavo Carneiro, Carlos Garcia, Pedro Neves, Zhikui Chen, Michelle Wetterwald, Manuel Ricardo, Pablo Serrano, Susana Sargento, and Albert Banchs. The DAIDALOS Architecture for QoS over Heterogeneous Wireless Networks. IST Mobile & Wireless Summit 2005, June 2005. [ bib ]

J. Jähnert et al. The Pure-IP Moby Dick 4G Architecture. Elsevier Computer Communication Journal, 2005. [ bib ]

David Boehringer, Albrecht Mangler, and Barbara Burr. Vorlesungsaufzeichnungen an der Universität Stuttgart. In Holger Horz, Wolfgang Hürst, Thomas Ottmann, Christoph Rensing, and Stephan Trahasch, editors, eLectures - Einsatzmöglichkeiten, Herausforderungen und Forschungsperspektiven, pages 13-18, 2005. [ bib | external link www ]

B. Burr, D. Boehringer, and P. Göhner. e-Learning and e-Teaching: Media Development at the Universität Stuttgart. In G. König, H. Lehmann, and R. Köhring, editors, Proceedings of the ISPRS working group VI/1 - VI/2, Tools and Techniques for E-Learning, volume XXXVI-6/W30. International Archives of Photogrammetry, 2005. [ bib ]

Zhikui Chen and Paul Christ. Improving the Radio Link Layer QoS Performance for Bluetooth Real-time Video Communications. Wireless Telecommunications Symposium, 2005. [ bib ]

2004

A. Cuevas et al. Field Evaluation of a 4G True-IP network. IST Mobile Communications Summit 2004, June 2004. [ bib ]

R. Aguiar, D. Bijwaard, J. Jähnert, P. Christ, and H. Einsiedler. Designing Networks for the Delivery of Advanced Flexible Personal Services: the Daidalos Approach. IST Mobile Communications Summit, June 2004. [ bib ]

W. Lu, J. Zhou, P. Kurtansky, and J. Jähnert. Charging in the Next Generation Mobile Internet. IST Mobile Communications Summit 2004, June 2004. [ bib ]

P. Kurtansky, Hasan Hasan, B. Stiller, D. Singh, S. Zander, A. Cuevas, J. Jähnert, and J. Zhou. Extensions of AAA for Future IP Networks. IEEE Wireless Communications and Networking Conference 2004, March 2004. [ bib ]

J. Jähnert and H.-W. Kim. The Moby Dick Approach Towards 4G Networks. Next Generation Teletraffic and Wired/Wireless Advanced Networking Conference 2004, February 2004. [ bib ]

D. Boehringer, B. Burr, and P. Göhner. e-learning and e-teaching - Media Development at the Universität Stuttgart. In ETH World Workshop "Best Practices of ICT Use in a University Environment", 2004. [ bib ]

D. Boehringer, B. Burr, P. Göhner, and A. Töpfer. E-Learning-Programme der Universität Stuttgart. In C. Bremer and K. Kohl, editors, E-Learning-Strategien und E-Learning-Kompetenzen an Hochschulen, pages 209-219, Bielefeld, 2004. [ bib ]

Y. Liang, P. Christ, J. Jähnert, A. Sarma, and T. Melia. Taming Monsters like Daidalos. ISSRE04 Workshop on Integrated-reliability with Telecommunications and UML Languages, 2004. [ bib ]

2003

J. Jähnert. Moby Dick. 3rd International Moby Dick Summit, November 2003. [ bib ]

Nevil Brownlee, Paul Christ, Jürgen Jähnert, Yongzheng Liang, Krishna Srinivasan, and Jie Zhou. MobyDick FlowVis - Using NeTraMet for distributed protocol analysis in a 4G network environment. IPOM 2003, October 2003. [ bib ]

J. Jähnert. IPv6: A Mobility Enabler. Online 2003, September 2003. [ bib ]

J. Jähnert. Problem Statement: Metering and Accounting in the full-IP 4G environment. Third International Workshop on Internet Charging and QoS Technology, September 2003. [ bib ]

Jürgen Jähnert. Cost-efficient Metering and Accounting for 4G networks. 18th International Teletraffic Congress, September 2003. [ bib ]

L. Burgstahler, K. Dolzer, C. Hauser, J. Jähnert, S. Junghans, C. Macian, and W. Payer. Beyond Technology: The Missing Pieces for QoS Success. ACM SIGCOMM RIPQoS workshop, August 2003. [ bib ]

Jürgen Jähnert. Game Theory - A concept pushing Wireless Internet Access towards a commercialized IP-dominating network. Curso de Verano Universidad Carlos III, June 2003. [ bib ]

J. Jähnert. Overview of the 6WinIT project. 3rd International Moby Dick Summit, 2003. [ bib ]

Wenhui Zhang, Jürgen Jähnert, and Klaus Dolzer. Design and Evaluation of A Handover Decision Strategy for 4th Generation Mobile Networks. IEEE VTC, 2003. [ bib ]

2002

Jürgen Jähnert. The Moby Dick Architecture. Moby Dick Summit, Madrid, November 2002. [ bib ]

M. Liebsch, X. Perez, R. Schmitz, A. Sarma, Jürgen Jähnert, S. Tessier, M. Wetterwald, and I. Soto. Solutions for IPv6-based mobility in the EU project Moby Dick. World Telecom Congress, Paris, September 2002. [ bib ]

Hasan Hasan, Jürgen Jähnert, et al. The Design of an Extended AAAC Architecture. MobileSummit, June 2002. [ bib ]

J. Angelopoulos, D. Boettle, Paul Christ, Jürgen Jähnert, H.-C. Leligou, and S. Wahl. Design and implementation of a DiffServ enabled HFC system offering strict QoS support. European Transactions on Telecommunication Journal Volume 6, 2002. [ bib ]

Jürgen Jähnert. Preisstrategien für drahtlosen Internetzugang der nächsten Generation. Diplomarbeit an der Fachhochschule für Technik und Wirtschaft Berlin, 2002. [ bib ]

Jürgen Jähnert. Price Strategies for Next Generation Wireless Internet Access. Invited talk at University Carlos III, Madrid, May 2002. [ bib ]

2001

Victor Marques, Rui Aguiar, Jürgen Jähnert, Karl Jonas, Marco Liebsch, Hans Einsiedler, and Francisco Fontes. An Heterogeneous Mobile IP QoS-aware Network. accepted for oral presentation at CRC, November 2001. [ bib ]

H. Einsiedler, Jürgen Jähnert, K. Jonas, M. Liebsch, and R. Schmitz. Mobility Support for a Future Communication Architecture. Mobile Summit 2001, September 2001. [ bib ]

Hasan Hasan, Jürgen Jähnert, S. Zander, and B. Stiller. Authentication, Authorisation, Accounting, and Charging for the Mobile Internet. Mobile Summit 2001, September 2001. [ bib ]

V. Marques, R. Aguiar, F. Fontes, Jürgen Jähnert, and H. Einsiedler. Enabling IP QoS in Mobile Environments. Mobile Summit 2001, September 2001. [ bib ]

Hasan Hasan, Jürgen Jähnert, S. Zander, and B. Stiller. Authentication, Authorization, Accounting, and Charging for the Mobile Internet. TIK Report No. 114, Computer Engineering and Networks Laboratory TIK, Swiss Federal Institute of Technology ETH Zürich, June 2001. [ bib ]

H. Einsiedler, R. L. Aguiar, Jürgen Jähnert, K. Jonas, M. Liebsch, R. Schmitz, P. Pacyna, J. Gozdecki, Z. Papir, J. I. Moreno, and I. Soto. The MobyDick Project: A Mobile Heterogeneous All-IP Architecture. Advanced Technologies, Applications and Market Strategies for 3G, 2001. [ bib ]

Paul Christ and Jürgen Jähnert. How will the Internet Survive the Mobility Shock? Mobility for All-IP Networks - Mobile IP Workshop, 2001. [ bib ]

2000

Jürgen Jähnert (RUS), S. Wahl (Alcatel Stuttgart), and H.C. Leligou (NTUA). Provision of QoS for legacy IP applications in an ATM over HFC access network. Interworking 2000, October 2000. [ bib ]

S. Wahl, H.C. Leligou, and Jürgen Jähnert. Architecture and Experiences of a Multi-Service HFC Network. Conference on High Performance Switching and Routing, June 2000. [ bib ]

1999

C. Guillemot, Paul Christ, S. Wesner, and A. Klemets. RTP Payload Format for MPEG-4 with Scaleable & Flexible Error Resiliency. Internet Draft, 1999. [ bib ]

C. Guillemot, S. Wesner, and Paul Christ. Integrating MPEG-4 into the Internet. ECMAST99, 1999. [ bib ]

Jürgen Jähnert and Holger Fahner. The impact of high-speed access networks on residential users. Telecom99, 1999. [ bib ]

Michael M. Resch, Dirk Rantzau, and Robert Stoy. Metacomputing experience in a transatlantic wide area application test-bed. Future Generation Computer Systems, FGCS 15, 1999. [ bib ]

1998

C. Guillemot, Paul Christ, and S. Wesner. RTP Generic Payload with Scaleable & Flexible Error Resiliency. Internet Draft, November 1998. [ bib ]

Paul Christ, Ch. Guillemot, and S. Wesner. RTSP-based Stream Control in MPEG-4. Internet Draft, November 1998. [ bib ]

Paul Christ, Jürgen Jähnert, Holger Fahner, A. Giannitrapani, S. Wesner, and W. Li. ATM-over-HFC-based Access To the Internet for Residential Users. Zweite ITG Anwender-Fachtagung Internet - frischer Wind in der Telekomunikation - Perspektiven für Anwender und Anbieter, October 1998. [ bib ]

Jürgen Jähnert et al. ATM-over-HFC-based Access to the Internet for Residential Users. Interworking Technology and Applications, Interworking98, Ottawa, July 1998. [ bib ]

D. Boettle, S. Wahl, B. Cesar, and Jürgen Jähnert. Broadband Access via Hybrid Fibre Coax Systems. ICON 1998, 1998. [ bib ]

Th. Eickermann, J. Henrichs, M. Resch, R. Stoy, and R. Voelpel. Metacomputing in gigabit environments: Networks, tools and applications. Parallel Computing 24, 1998. [ bib ]

1997

D. Böttle and Holger Fahner. Introducing Broadband Services on HFC Networks. International Conference on Broadband Strategies London, December 1997. [ bib ]

Holger Fahner, G. Ramlot, and D. Böttle. ATHOC Trials: Architecture and First Results. 8th ACTS Concertation Meeting, Brussels, December 1997. [ bib ]

D. Böttle, Holger Fahner, Paul Christ, et al. ATM Applications over Hybrid Fibre Coax Trials. International Switching Symposium 97, Toronto, September 1997. [ bib ]

G. Maiß and Holger Fahner. MBone im B-Win. DFN Mitteilungen Heft Nr.44, June 1997. [ bib ]

Paul Christ, Holger Fahner, W. Li, and Jürgen Jähnert. Signalling Issues for IP over ATM in a HFC Environment. 6th ACTS Concertation Meeting, Brussels, June 1997. [ bib ]

1996

Holger Fahner and Peter Feil. Tutorial: MBone (IP Multicast), Grundlagen, Tools, Anwendungen. Opennet'96 Konferenz, Berlin, November 1996. [ bib ]

1995

Holger Fahner. Das europäische ATM. Benutzerinformation des Rechenzentrums, Bi Ausgabe 3/95, Rechenzentrum Universität Stuttgart, March 1995. [ bib ]

Peter Feil. High Speed Networking in Europe Today: RUS, Pilot User of the European ATM Network. Broadband Islands 95, Proceedings of the fourth International Conference, 1995. [ bib ]

1994

Holger Fahner. PAGEIN Demonstration with a Wide Area CBDS/SMDS Configured Network. RACE Workshop: ATM in Reality, Brussels, July 1994. [ bib ]

