[Humanist] 23.283 cfp: Digital Human Faces
Humanist Discussion Group
willard.mccarty at mccarty.org.uk
Thu Sep 10 07:22:05 CEST 2009
Humanist Discussion Group, Vol. 23, No. 283.
Centre for Computing in the Humanities, King's College London
Submit to: humanist at lists.digitalhumanities.org
Date: Tue, 8 Sep 2009 22:05:12 +0100
From: Catherine Pelachaud <catherine.pelachaud at telecom-paristech.fr>
Subject: special issue IEEE CG&A - Digital Human Faces: from Creation to Emotion
Special Issue on Digital Human Faces: From Creation to Emotion
/IEEE Computer Graphics and Applications
/Submission deadline: 23 Nov. 2009
Target publication date: July/Aug. 2010
Faces are an important vector of communication. Through facial
expressions, gaze behaviors, and head movements, faces convey
information not only about a person’s emotional state and attitude but
also about discursive, pragmatic, and syntactic elements. The expressions result
from subtle muscular contractions and wrinkle formation, and we perceive
them through the complex filter of subsurface scattering and other
nontrivial light reflections.
Lately, there has been much interest in modeling 3D faces and their
expressions. Research has covered automatic or interactive generation of
3D geometry as well as rendering and animation techniques. This research
has numerous applications. One type of application involves the creation
and animation of virtual actors in films and video games. New rendering
techniques ensure highly realistic skin models. Motion capture with or
without markers is applied to animate the body and the face. The quality
can be precise enough to capture real actor performances as well as the
slightest movements in emotional expressions.
Another type of application involves the creation of autonomous
agents—in particular, /embodied conversational agents/ (ECAs),
autonomous entities with communicative and emotional capabilities. ECAs
serve as Web assistants, pedagogical agents, or even companions.
Researchers have proposed models to specify and control ECA behavior.
This special issue will broadly cover domains linked to 3D faces and
their creation, rendering, and animation. In particular, it aims to
gather excellent work from the computer graphics and ECA communities.
Possible topics include, but aren’t limited to, the following:
* /Facial animation/.
* /Face and performance capture/ (marker based or markerless).
* /Geometric modeling of faces/ (automatic or interactive).
* /Face and skin rendering techniques/ (subsurface scattering and
other nontrivial light reflections).
* /Expressing emotion/. How do you go beyond the expression of the
six basic emotions? How do you represent the large palette of
facial expressions of emotions? How do you model dynamic
expressions? How do you model the expression of empathy?
* /Complex expressions/. Expressions can arise from the blending of
emotions, such as the superposition of emotions or the masking of
one emotion by another. Expressions can simultaneously convey
more than one emotional state.
* /Communicative expressions/. Faces don’t solely portray emotions;
they convey a variety of communicative functions such as visual
prosody and performative functions. How do you model and represent
such expressions? How do you capture subtle variations in the
production of them? How do you model mechanisms that are
synchronous with speech?
* /Social signals/. Communication is socially embedded. Agents and
virtual actors must be socially aware. People often use smiles and
eyebrow flashes to signal their attitude toward others. How do you
model facial social signals?
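The superposition of expressions mentioned above is often implemented with linear blendshapes, where a neutral face mesh is combined with weighted per-expression vertex offsets. A minimal sketch of that idea follows; the function name, the expression names, and the toy 1-D "mesh" are illustrative assumptions, not part of the call for papers.

```python
# Hedged sketch: linear blendshape mixing, one common way to superpose
# facial expressions. All names and data here are illustrative.

def blend_expressions(neutral, deltas, weights):
    """Combine a neutral face with weighted expression offsets.

    neutral: list of vertex coordinates for the neutral face
    deltas:  dict mapping expression name -> per-vertex offsets
    weights: dict mapping expression name -> blend weight in [0, 1]
    """
    result = list(neutral)  # copy so the neutral pose is untouched
    for name, w in weights.items():
        for i, d in enumerate(deltas[name]):
            result[i] += w * d  # add the weighted offset per vertex
    return result

# Toy example: three "vertices", two expressions superposed.
neutral = [0.0, 0.0, 0.0]
deltas = {"smile": [1.0, 0.0, -1.0], "surprise": [0.0, 2.0, 0.0]}
mixed = blend_expressions(neutral, deltas,
                          {"smile": 0.5, "surprise": 0.25})
print(mixed)  # [0.5, 0.5, -0.5]
```

In practice the same weighted sum runs over thousands of vertices, and the research questions in the bullets above concern how the weights vary over time and how they map to emotional and communicative intent.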
Articles should be no more than 8,000 words, with each figure counting
as 200 words. Cite only the 12 most relevant references, and consider
providing technical background in sidebars for nonexpert readers. Color
images are preferable and should be limited to 10. See the /CG&A/ style
and length guidelines at www.computer.org/cga/author.html.
Please submit your article using the online manuscript submission
service at https://mc.manuscriptcentral.com/cs-ieee. When uploading your
article, select the appropriate special-issue title under the category
“Manuscript Type.” Also include complete contact information for all
authors. If you have any questions about submitting your article,
contact the peer review coordinator at cga-ma at computer.org.
Please direct any correspondence before submission to the guest editors:
* Catherine Pelachaud, catherine.pelachaud at telecom-paristech.fr
* Tamy Boubekeur, tamy.boubekeur at telecom-paristech.fr
CNRS - LTCI UMR 5141
Institut TELECOM - TELECOM ParisTech
37 rue Dareau
75014 Paris FRANCE
tel: +33 (0)1 45 81 75 93
http://www.tsi.enst.fr/~pelachau