
Unreal Engine 3.0, CryEngine 2.0, Source Engine 2.0 and 3.0, and others


sandrovilanova

Recommended posts

Posted

(Updated 11/05 with a CryEngine 2.0 video.) This topic was created to showcase the various engines (graphics engines) used in next-generation console and PC games. We begin with Unreal Engine 3.0: although Gears of War is its showcase title, that game does not show the engine's full potential when compared against the description below.

Unreal Engine 3

Overview

Unreal Engine 3 is a complete game development framework for next-generation consoles and DirectX9-equipped PCs, providing the vast array of core technologies, content creation tools, and support infrastructure required by top game developers.

ue3_p_embry3-s.jpg

ue3_p_geist-s.jpg

ue3_p_soldier-s.jpg

ue3_p_bezerk-s.jpg

ue3_p_embry2-s.jpg

Every aspect of the Unreal Engine has been designed with ease of content creation and programming in mind, with the goal of putting as much power as possible in the hands of artists and designers to develop assets in a visual environment with minimal programmer assistance; and to give programmers a highly modular, scalable and extensible framework for building, testing, and shipping games in a wide range of genres.

Rendering Features

  • Multi-threaded rendering system – Gemini.
  • 64-bit color High Dynamic Range rendering pipeline. The gamma-correct, linear color space renderer provides for immaculate color precision while supporting a wide range of post processing effects such as light blooms, lenticular halos, and depth-of-field.
  • Support for all modern per-pixel lighting and rendering techniques including normal mapped, parameterized Phong lighting; custom artist controlled per material lighting models including anisotropic effects; virtual displacement mapping; light attenuation functions; pre-computed shadow masks; directional light maps; and pre-computed bump-granularity self-shadowing using spherical harmonic maps.
  • Advanced Dynamic Shadowing. Unreal Engine 3 provides full support for four shadowing techniques:
    • Dynamic stencil buffered shadow volumes supporting fully dynamic, moving light sources casting accurate shadows on all objects in the scene.
    • Dynamic characters casting dynamic soft shadows on the scene using shadow buffers. Shadow buffer filtering takes samples on a jittered disc that are rotated per-pixel to detect shadow penumbras. Dynamic branching is then used to refine shadow coverage in penumbra regions.
    • Ultra high quality and high performance pre-computed shadow masks allow offline processing of static light interactions, while retaining fully dynamic specular lighting and reflections.
    • Directional Light Mapping enables the static shadowing and diffuse normal-mapped lighting of an unlimited number of lights to be precomputed and stored into a single set of texture maps, enabling very large light counts in high-performance scenes.

  • All of the supported shadow techniques are visually compatible and may be mixed freely at the artist's discretion, and may be combined with colored attenuation functions enabling properly shadowed directional, spotlight, and projector lighting effects.
  • Volumetric environmental effects including height fog.
  • Full support for seamlessly interconnected indoor and outdoor environments with dynamic per-pixel lighting and shadowing supported everywhere.
  • Split-screen rendering.
  • High-resolution screenshot support.
  • Post-processing effects: motion blur, depth of field, and bloom.
  • Artists can build terrain using a dynamically-deformable base height map extended by multiple layers of smoothly-blended materials including displacement maps, normal maps and arbitrarily complex materials, dynamic LOD-based tessellation, and vegetation layers with procedurally-placed meshes. Further, the terrain system supports artist-controlled layers of procedural weathering, for example, grass and vegetation on the flat areas of terrain, rock on high slopes, and snow at the peaks.
  • Powerful material system, enabling artists to create arbitrarily complex realtime shaders on-the-fly in a visual interface that is comparable in power to the non-realtime functionality provided by Maya.
  • The material framework is modular, so programmers can add not just new shader programs, but shader components which artists can connect with other components on-the-fly, resulting in dynamic composition and compilation of shader code.
  • Extensible particle system with visual editor – UnrealCascade – supporting particle physics and environmental effects.
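The slope- and height-driven procedural weathering described in the terrain bullet above can be sketched in a few lines. This is purely an illustrative stand-in (the function name and thresholds are invented, not engine API); a real terrain system blends layer weights smoothly rather than picking a single winner:

```python
def pick_terrain_layer(height, slope_deg, snow_line=0.8, rock_slope=40.0):
    """Illustrative layer selection: grass on flat ground, rock on steep
    slopes, snow above the snow line. Height is normalized to 0..1;
    thresholds are made-up defaults, not engine values."""
    if height >= snow_line and slope_deg < rock_slope:
        return "snow"
    if slope_deg >= rock_slope:
        return "rock"
    return "grass"

# A real engine would output smooth per-layer blend weights instead.
print(pick_terrain_layer(0.2, 5.0))   # flat lowland
print(pick_terrain_layer(0.5, 55.0))  # steep cliff face
print(pick_terrain_layer(0.9, 10.0))  # flat peak
```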

Posted

ue3_softshadows-s.jpg

ue3_dynalight-s.jpg

Characters in Unreal Engine 3 produce dynamic soft shadows with self-shadowing.

Shadows with fuzzy attenuation sweep around the scene as the torch moves.

ue3_complexmat-s.jpg

ue3_fbdistort-s.jpg

Artist-authored panning and iridescent materials all seamlessly combine with per-pixel lighting and shadowing.

Normal-mapped translucent object distorts and attenuates the frame buffer, simulating ray-traced reflections.

Posted

ue3_terrain-s.jpg

ue3_fog-s.jpg

The fuzzy shadows of clouds smoothly roll across the hills, while the windmill's rotating blades cast shadows on the ground beneath.

Soft-shadowed character standing in a bank of volumetric fog.

ue3_hdrglow-s.jpg

ue3_spec-s.jpg

Light blooms using 64-bit color High Dynamic Range color.

The subtle interplay of normal mapped diffuse and specular lighting with fuzzy shadows.

ue3_cascade_new-s.jpg

Realtime particles in “UnrealCascade”, the modular, visual particle system editor.

Posted

Distributed Computing Normal Map Generation Tool

Most of our characters are built from two meshes: a realtime mesh with thousands of triangles, and a detail mesh with millions of triangles. We provide a distributed-computing application which raytraces the detail mesh and, from its high-polygon geometry, generates a normal map that is applied to the realtime mesh when rendering. The result is in-game objects with all of the lighting detail of the high poly mesh, but that are still easily rendered in real time.
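The baking step described above can be illustrated with a toy version: for every texel of the renderable mesh's UV grid, sample the detail surface's normal and pack it into an RGB texel. Here an analytic bump function stands in for ray-tracing a multi-million-triangle detail mesh, and every name is invented for illustration (this is not Epic's actual tool):

```python
import math

def encode_normal(n):
    """Pack a unit normal (components in [-1, 1]) into an 8-bit RGB texel,
    the usual normal-map encoding."""
    return tuple(int(round((c * 0.5 + 0.5) * 255)) for c in n)

def bake_normal_map(detail_normal, size):
    """For each texel of the low-poly mesh's UV grid, sample the detail
    surface's normal (an analytic stand-in for ray-tracing the high-poly
    mesh) and store it in the map."""
    return [[encode_normal(detail_normal(u / (size - 1), v / (size - 1)))
             for u in range(size)]
            for v in range(size)]

def bumpy(u, v):
    """Stand-in detail surface: gentle sinusoidal bumps; the normal is
    derived from the height-field gradient."""
    dx = 0.3 * math.cos(6.28 * u)
    dy = 0.3 * math.cos(6.28 * v)
    l = math.sqrt(dx * dx + dy * dy + 1.0)
    return (-dx / l, -dy / l, 1.0 / l)

nmap = bake_normal_map(bumpy, 4)
print(nmap[0][0])  # a tilted texel; a flat one would encode as (128, 128, 255)
```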

ue3_char2-s.jpg

ue3_char1-s.jpg

5,287 triangle in-game mesh in 3D Studio Max.

Purely geometric 2,000,000 triangle detail mesh in 3D Studio Max.

ue3_char3-s.jpg

Resulting normal-mapped mesh in game.

ue3_over1-s.jpg

ue3_over2-s.jpg

Over 100 million triangles of source content contribute to the normal maps which light this outdoor scene.

Wireframe reveals memory-efficient content comprising under 500,000 triangles.

Posted

Audio

  • Support for all major output formats of each platform, including 5.1 surround sound and certification-quality Dolby Digital.
  • 3D sound positioning, spatialization, Doppler shift.
  • Cross-platform DSP effects (reverb, etc.).
  • Compression on all platforms; SIMD optimized Ogg Vorbis on PC.
  • Multi-channel playback (4.0, 5.1, 6.1 and 7.1).
  • Microphone input.
  • Visual Sound Tool in UnrealEd gives sound designers complete control over sounds, sound levels, sequencing, looping, filtering, modulation, pitch shift, and randomization. Sound parameters are separated from code to an extent that sound designers can control all sounds associated with gameplay, cinematics and animation sequences.
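The Doppler shift listed above boils down to one formula: the classic non-relativistic pitch factor from the velocity components along the source-to-listener axis. This sketch is generic physics, not engine code:

```python
def doppler_pitch(speed_of_sound, listener_v, source_v):
    """Classic Doppler factor. listener_v and source_v are the velocity
    components along the source-to-listener direction, positive when the
    two are moving toward each other."""
    return (speed_of_sound + listener_v) / (speed_of_sound - source_v)

# A source closing at 10% of the speed of sound plays ~11% higher:
factor = doppler_pitch(343.0, 0.0, 34.3)
print(round(factor, 3))  # 1.111
```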

ue3_soundcue-s.jpg

UnrealEd’s sound cue editor.

Physics

  • Powered by Ageia PhysX.
  • Rigid body physics system supporting player interaction with physical game objects, ragdoll character animation, complex vehicles, and dismemberable objects.
  • All renderable materials have physical properties such as friction.
  • Physics-driven sound.
  • Fully integrated support for physics-based vehicles, including player control, AI, and networking.
  • Gameplay-driven physical animation – capable of playing animations while being influenced by physics.
  • Cloth simulation.
  • UnrealPhAT, the visual physics modeling tool built into UnrealEd, supporting creation of optimized collision primitives for models and skeletal animated meshes; constraint editing; and interactive physics simulation and tweaking in-editor.

    ue3_ragdoll-s.jpg
    ue3_phat_rap-s.jpg
    Ragdoll rigid body dynamics drive all objects in this scene, including these skeletal-animated objects.
    Unreal Engine 3 improves on the vehicle physics features and provides an intuitive and interactive physics setup tool.
    ue3_phat_char-s.jpg
    UnrealPhAT enables the creation and adjustment of skeletal setup, constraints, and breaking forces in realtime.


Posted

Animation

  • Skeletal animation system supporting up to 4 bone influences per vertex and very complex skeletons.
  • Full mesh and bone LOD support.
  • AnimSet Viewer tool for browsing and organizing animations and meshes:
    • Ability to add game-specific notifications at specific points in the animation.
    • Tool for graphically placing ‘Sockets’ on bones to be used for attaching objects to the skeleton in the game, complete with preview.
    • Ability to preview ‘overlay’ meshes based on the same skeleton (e.g. armor).

    ue3_anim_socket.jpg

    “AnimSet Viewer” example with Socket Manager.

  • Animation is driven by an “AnimTree” - a tree of animation nodes including:
    • Blend controllers, performing an n-way blend between nested animation objects.
    • Data-driven controllers, encapsulating motion capture or hand animation data.
    • Physics controllers, tying into the rigid body dynamics engine for ragdoll player and NPC animation and physical response to impulses.
    • Procedural skeletal controllers, for game features such as having an NPC's head and eyes track a player walking through the level.
    • Inverse Kinematics solver for calculating limb pose based on a goal location (e.g. for foot placement).

  • AnimTree Editor allows programmers or animators to create complex blends and controller setups and preview them in realtime in the editor.

ue3_anim_tree.jpg

“AnimTree Editor” example.

  • New node and controller types can be easily added for game specific control.
  • Export tools for 3D Studio Max, Maya and XSI for bringing weighted meshes, skeletons, and animation sequences into the engine.
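The "up to 4 bone influences per vertex" limit refers to linear blend skinning: the skinned vertex is the weight-blended sum of its position as transformed by each influencing bone. A minimal sketch, with bone transforms reduced to plain translations for clarity (not engine code; names are invented):

```python
def skin_vertex(position, influences, bone_transforms):
    """Linear blend skinning. influences is a list of (bone_name, weight)
    pairs, at most 4 per the engine limit described above; transforms are
    simplified to translations (a real engine uses full 3x4 matrices)."""
    assert len(influences) <= 4
    x = y = z = 0.0
    for bone, weight in influences:
        tx, ty, tz = bone_transforms[bone]
        x += weight * (position[0] + tx)
        y += weight * (position[1] + ty)
        z += weight * (position[2] + tz)
    return (x, y, z)

bones = {"upper_arm": (0.0, 1.0, 0.0), "forearm": (0.0, 2.0, 0.0)}
# A vertex near the elbow, influenced half by each bone:
print(skin_vertex((1.0, 0.0, 0.0), [("upper_arm", 0.5), ("forearm", 0.5)], bones))
```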

Game Scripting and Cinematics

  • UnrealKismet, our visual scripting system:
    • Gives artists and level designers virtually limitless control over how a level will play without touching a single line of code.
    • By connecting together simple events and actions created by programmers, everything from simple behaviours to complete gameplay prototypes can be assembled quickly.
    • UnrealKismet supports hierarchies of scripts for organizing very complex sequences into manageable units.

ue3_kismet1-s.jpg

ue3_kismet2-s.jpg

“UnrealKismet” visual scripting system example 1.

“UnrealKismet” visual scripting system example 2.

  • UnrealMatinee, for keyframing properties over time and creating in-game cinematics:
    • Track based system, with support for controlling movement, modifying properties, playing animation and sound, making camera-cuts, changing FOV, fading etc.
    • Preview and scrubbing of sequence completely in-editor for instant feedback.
    • Curve editor, for getting fine control of movements or properties over time.
    • Ability to connect multiple actors to an UnrealMatinee group (e.g. for dimming 10 lights at once).
    • UnrealMatinee data can be shared across multiple instances (e.g. using the same animation for every door in a level).
    • Any property can trivially be exposed for UnrealMatinee to control. New track types can also be easily added.

  • UnrealMatinee and UnrealKismet are tightly integrated. Arbitrary gameplay events can be triggered at specific points in a sequence, and level designers have complete control over playing, stopping and reversing sequence playback.

ue3_matinee.jpg

UnrealEd's "Matinee" timeline-based sequencing tool.

User Interface System

  • Flexible system for creating scenes that contain widgets for displaying information and processing events, such as Menus and HUDs.
  • Visual UI Editor tool allows for easy creation of UI content, including creating skins for custom widget styles; binding widgets to in-game data; and responding to events.

Posted

UnrealEd Content Creation Tool

  • The Unreal Editor (UnrealEd) is a pure “What You See Is What You Get” content creation tool filling the void between 3D Studio Max and Maya, and shippable game content.
  • A powerful browser framework for finding, viewing, and organizing game assets of all types.
  • Visual placement and editing of gameplay objects such as players, NPCs, inventory items, AI path nodes, and light sources – with a full realtime view of their appearance, including 100% dynamic shadowing.
  • Includes a data-driven property editing framework, allowing level designers to easily customize any game object, and programmers to expose new customizable properties to designers via script.
  • Realtime terrain editing tools allowing artists to elevate terrain, paint alpha layers onto terrain to control layer blending and decoration layers, collision data, and displacement maps.
  • Visual Material Editor. By visually connecting the color, alpha and coordinate outputs of textures and programmer-defined material components, artists can create materials ranging from simple layered blends to extremely complex materials that interact dynamically with scene lights.
  • Animation tool enables artists to import models, skeletons, and animations, and to tie them to in-game events such as sounds and script notifications.
  • In-editor “Play Here” button puts gameplay just one mouse click and a fraction of a second away. Here, you can test gameplay in-editor in one window while modifying objects and rearranging geometry in another.
  • Plug-ins for 3D Studio Max and Maya to bring models into the Unreal engine with mesh topology, mapping coordinates, smoothing groups, material names, skeleton structure, and skeletal animation data.
  • Support for a COLLADA import path for meshes and animation.
  • Fully integrated source control, so that artists and level designers can check out content packages, modify, and check in from within the editor.
  • All the other niceties you'd expect from a modern content editing tool: Multi-level undo/redo, drag-and-drop, copy-and-paste; customizable key and color configuration; viewport management.
  • Every Unreal Engine license includes the right to redistribute UnrealEd publicly, enabling teams to release the content creation tools along with their game to the mod community. Mod support has been a major factor behind the success of many prominent PC games today, and we anticipate that support for PC-based mod development may be a significant factor in future console games as well.

ue3_editorprop-s.jpg

ue3_browser-s.jpg

UnrealEd, the central hub that binds the various tools into a seamless package.

Visual interface for content browsing, searching, and management.

ue3_terraedit-s.jpg

ue3_mated-s.jpg

Visual terrain editor with realtime deformation, vegetation layers, and procedural layering.

Visual material editor enables artists to create elaborate shaders previously only accessible to programmers.

ue3_playineditor-s.jpg

"Play In Editor" allows artists and designers to immediately see their changes in-game.

Posted

Localization

  • Unreal Engine 3 content and code are localization-aware, with a simple framework for externalizing all game text, audio, images, and videos. Localized assets can be cooked for seek-free loading.
  • Unreal Engine 3 is based on the Unicode character set, and has full support for 16-bit Unicode fonts and text input, including importing TrueType fonts into renderable bitmap fonts.
  • Our games have shipped in 9 languages including Japanese, Chinese, and Korean.

ue2_korean1-s.jpg

Unicode-based localization framework powering UT2004 Korean version

Game Framework & Artificial Intelligence

  • An object-oriented gameplay framework is provided supporting common game objects such as players, NPC's, inventory, weapons, and triggers.
  • Rich multi-level AI system supporting path-finding, complex level navigation, individual decision making, and team-based AI:
    • Pathfinding framework with full awareness of common game objects such as triggers, doors and elevators, allowing for complex navigation scenarios where an NPC will press switches, open doors, and navigate around temporary obstructions in order to reach its destination.
    • Navigation framework with support for short-term tactical combat, cover, and navigation off the path network.
    • Team-based AI framework suitable for first-person shooters, third-person shooters, and tactical combat games. The team-based AI framework provides support for team coordination, objective management, and long-term goals.

  • AI paths are viewable and editable by level designers in UnrealEd, allowing customization and hinting.

Networking

  • Internet and LAN play has been a hallmark of Epic's past competitive games such as Unreal Tournament 2004. The Unreal Engine has long provided a flexible and high-level network architecture suitable to many genres of games.
  • Internet and LAN play is fully supported on PC and all console platforms.
  • Unreal Engine gameplay network programming is high-level and data-driven, allowing UnrealScript game code to specify variables and functions to be replicated between client and server to maintain a consistent approximation of game state. The low-level game networking transport is UDP-based and combines reliable and unreliable transmission schemes to optimize gameplay, even in low-bandwidth and high-latency scenarios.
  • Client-server model supporting up to 64 players as provided. Also supports non-dedicated server (peer-to-peer mode) with up to 16 players.
  • Supports network play between different platforms (e.g. a dedicated PC server with console clients, or Windows, MacOS and Linux clients playing together).
  • All gameplay features are supported in network play, enabling vehicle-based multiplayer games, competitive team games with NPC's or bots, cooperative play in a single player focused game, and so on.
  • Support for auto-downloading and caching content, including cross-platform compatible UnrealScript code. This feature enables everything from user-created maps, to bonus packs, to complete game mods to be downloaded on the fly.
  • In-game server browser GUI for finding and querying servers, keeping track of favorites, in-game chat, etc.
  • A “master server” component is provided for tracking worldwide servers, providing filtered server lists to players, etc.; worldwide game stats tracking system.
  • Please note that we don't provide a server or networking framework suitable for massively multiplayer games. Though such a task is a multi man-year engineering effort, several teams using the Unreal Engine have done so (including Sigil Games Online for Vanguard and NCSoft for Lineage II), demonstrating the feasibility of using the Unreal Engine as a MMORPG game client and tools pipeline, integrated with a proprietary server component.
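The replication model described above — keeping each client's approximation of game state consistent by sending only the variables that changed — can be sketched as a dictionary diff. This is a toy illustration (names invented); the real engine additionally prioritizes, batches, and chooses reliable versus unreliable delivery per property:

```python
def diff_state(server_state, client_state):
    """Compute the minimal property delta the server must replicate so
    the client's copy of game state catches up."""
    return {k: v for k, v in server_state.items() if client_state.get(k) != v}

server = {"health": 80, "pos": (10, 4), "ammo": 12}
client = {"health": 100, "pos": (10, 4), "ammo": 12}
delta = diff_state(server, client)
print(delta)  # only the changed property crosses the wire
client.update(delta)
assert client == server  # client approximation is consistent again
```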

ue25_browser-s.jpg

ue25_stats-s.jpg

Unreal Tournament 2004's in-game server browser

Framework for tracking worldwide player rankings in UT2004.

ue25_boom-s.jpg

Networking framework suitable even for the world's most competitive action games.

Programming Features

  • Unreal Engine 3 includes example content and 100% of the source code for the engine, editor, Max/Maya exporters, and the game-specific code for our internally-developed games.
  • Extensible, object-oriented C++ engine with software framework for persistence, dynamic loading of code and content, portability, debugging.
  • UnrealScript gameplay scripting language provides automatic support for metadata; persistence with very flexible file format backwards-compatibility; support for exposing script properties to level designers in UnrealEd; a GUI-based script debugger; and native language support for many concepts important in gameplay programming, such as dynamically scoped state machines and time-based (latent) execution of code.
  • Modular material component interface for extending the visual tools and adding new shader components usable by artists in the visual shader GUI.
  • Source control friendly software architecture, scalable to large teams and multi-platform projects.
  • Unreal Engine 3 is provided as one unified codebase that compiles on PC and all supported next-generation console platforms. All game content and data files are compatible across all supported platforms, for fast turnaround time between code and content development on PC, and playtesting on console or PC.
  • Seek-free DVD loading optimization pass for consoles, able to load levels at >80% of DVD's physical transfer rate.
  • Extensible content-streaming framework suitable for multithreaded background DVD streaming of resources and predefined groups of resources based on LOD or programmatic control.

Debugging and Performance Monitoring Tools

  • Memory tracking tools (including GPU memory).
  • Remote Control utility – extensible tool that runs alongside the game to allow for setting flags and running helper commands.

UnrealScript

  • UnrealScript gameplay scripting language is a high-level Java-like object oriented programming language.
  • Native language support for many concepts important in gameplay programming, such as dynamically scoped state machines and time-based (latent) execution of code.
  • Provides automatic support for metadata; persistence with very flexible file format backwards-compatibility; support for exposing script properties to level designers in UnrealEd.
  • UnrealScript debugger.
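UnrealScript's latent execution — code inside a state that suspends itself for a span of game time, like a latent Sleep() call — maps naturally onto coroutines. A toy sketch using Python generators (all names invented for illustration; this is not UnrealScript syntax):

```python
def guard_patrol():
    """A state-machine-style script: each yield hands back an action plus
    a latent delay in seconds, mimicking Sleep() inside a script state."""
    while True:
        yield ("walk_to", "point_a"), 2.0
        yield ("walk_to", "point_b"), 2.0
        yield ("look_around", None), 1.0

def run(script, total_time):
    """Toy scheduler: advance the script, consuming its latent delays,
    until the time budget runs out."""
    log, clock = [], 0.0
    for action, delay in script:
        if clock >= total_time:
            break
        log.append(action)
        clock += delay
    return log

print(run(guard_patrol(), 5.0))
```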

ue3_script2-s.jpg

UnrealScript provides for safe, easy game event programming.

Unreal Engine Integrated Partners

Epic Games' Integrated Partners Program establishes a formal business relationship between the game and middleware developer and select companies developing cross-platform technologies which integrate with, and are complementary to, Unreal Engine 3. As part of the program, Epic provides continuous Unreal Engine 3 source code access and full technical support to program members.

In addition, those companies that join Epic's partnership program agree to provide a high level of technical support for Unreal Engine 3 licensees through Epic’s established support channels, as well as keep implementations up-to-date with the latest software versions, and work with Epic on potential promotional and co-marketing efforts.

Such companies include FaceFX by OC3 Entertainment, SpeedTree by IDV, Inc., and Bink Video by RAD Game Tools – integrated and used by Epic in their games. Other products include AI middleware and cross-platform UI solutions, to name a few.

Proven Technology

Unreal Engine 3 was used to power games such as Epic’s recently released Gears of War and the upcoming Unreal Tournament 3, as well as many games by licensees that represent some of the best studios in the industry.

Typical Content Specifications

Here are the guidelines we're using in building content for our next Unreal Engine 3 based game. Different genres of games will have widely varying expectations of player counts, scene size, and performance, so these specifications should be regarded as one data point for one project rather than hard requirements for all.

Characters

For every major character and static mesh asset, we build two versions of the geometry: a renderable mesh with unique UV coordinates, and a detail mesh containing only geometry. We run the two meshes through the Unreal Engine 3 preprocessing tool and generate a high-res normal map for the renderable mesh, based on analyzing all of the geometry in the detail mesh.

  • Renderable Mesh: We build renderable meshes with 3,000-12,000 triangles, based on the expectation of 5-20 visible characters in a game scene.
  • Detail Mesh: We build 1-8 million triangle detail meshes for typical characters. This is quite sufficient for generating 1-2 normal maps of resolution 2048x2048 per character.
  • Bones: The highest LOD versions of our characters typically have 100-200 bones, and include articulated faces, hands, and fingers.

Normal Maps & Texture maps

We are authoring most character and world normal maps and texture maps at 2048x2048 resolution. We feel this is a good target for games running on mid-range PC's in the 2006 timeframe. Next-generation consoles may require reducing texture resolution by 2X, and low-end PC's up to 4X, depending on texture count and scene complexity.
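The arithmetic behind those reductions: texture memory scales with the square of resolution, so a "2X" reduction (halving each dimension) cuts memory to a quarter. A quick check, assuming uncompressed 32-bit RGBA against a DXT1-style 4 bits-per-pixel compressed format (the formats are my assumption, not stated in the text):

```python
def texture_mib(width, height, bits_per_pixel):
    """Memory for a single mip level, in MiB (mip chains add ~33% more)."""
    return width * height * bits_per_pixel / 8 / (1024 * 1024)

print(texture_mib(2048, 2048, 32))  # 16.0 MiB uncompressed RGBA8
print(texture_mib(1024, 1024, 32))  # 4.0 MiB after the 2X reduction
print(texture_mib(2048, 2048, 4))   # 2.0 MiB with 4 bpp block compression
```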

Environments

Typical environments contain 1000-5000 total renderable objects, including static meshes and skeletal meshes. For reasonable performance on current 3D cards, we aim to keep the number of visible objects in any given scene to 300-1000 visible objects. Our larger scenes typically peak at 500,000 to 1,500,000 rendered triangles.

Lights

There are no hardcoded limits on light counts, but for performance we try to limit the number of large-radius lights affecting large scenes to 2-5, as each light/object interaction pair is costly due to the engine's high-precision per-pixel lighting and shadowing pipeline. Low-radius lights used for highlights and detail lighting on specific objects are significantly less costly than lights affecting the full scene.
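The budget follows from the cost model: with roughly one shading pass per light/object interaction pair, a full-scene light multiplies against the whole visible object count, while a low-radius detail light only touches the objects inside its radius. A back-of-the-envelope sketch (numbers and function name are illustrative):

```python
def interaction_pairs(scene_lights, visible_objects, detail_light_hits):
    """Rough count of light/object interaction pairs per frame: every
    full-scene light touches every visible object; detail lights
    contribute only the objects they actually overlap."""
    return scene_lights * visible_objects + detail_light_hits

# 3 full-scene lights over 500 visible objects, plus 20 detail lights
# hitting ~5 objects each:
print(interaction_pairs(3, 500, 20 * 5))  # 1600
```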

Posted

Now for this year's eagerly anticipated CryEngine 2, which will be demonstrated in Crysis (the big expectation-builder of the year).

CryENGINE™ 2

character_facial_editor1.jpg

Real time editing, bump mapping, static lights, network system, integrated physics system, shaders, shadows and a dynamic music system are just some of the state-of-the-art features the CryENGINE™ 2 offers.

character_facial_editor2.jpg

The CryENGINE™ 2 comes complete with all of its internal tools and also includes the CryENGINE™ 2 Sandbox world editing system.

Licensees receive full source code and documentation for the engine and tools. Support is provided directly from the R & D team that continuously develops the engine, which can also arrange teaching workshops for your team to accelerate the learning process.

The engine supports all video hardware currently on the market. New hardware support is constantly added as it becomes available.

  • Polybump™ 2: Polybump™ 2 can be used either as a standalone utility or fully integrated with other tools such as 3DS Max™. This tool creates a high quality surface description that allows quick extraction of surface features like normal maps (tangent-space or object-space), displacement maps, unoccluded area direction, accessibility and other properties. The extracted information can be used to render low-poly models with surface detail that makes them look almost like the high-poly models, but renders much faster. The data is stored in an intermediate file format so it can be exported in different ways without redoing the computation. Very high polygon counts (e.g. 10 million triangles) are processed quite quickly.
  • Next Generation Real-Time Renderer: Our renderer provides seamless support for both indoor and outdoor environments on DirectX9 and 10, as well as support for next generation consoles such as the Xbox360 and PS3 (under development).


  • Real Time Lighting and Dynamic Soft Shadows
    : CryENGINE™ 2 features natural looking light sources, and creates soft shadows that dynamically respond to natural movements. It includes high-resolution, perspective correct and volumetric smooth-shadow implementations.

  • Volumetric, Layer and View Distance Fogging
    : Create clouds or fog banks which can hug the ground, realistically reduce both visibility and contrast, and properly interact with both dynamic lights and shadows, adding depth and dimension to a landscape by reducing scene contrast and clarity for distant landmarks.

  • Terrain 2.5D Ambient Occlusion Maps
    : On a per pixel level, approximates the amount of ambient (fill) light reaching an object (static or dynamic) depending on the amount of ambient occlusion created by the surrounding foliage and structures.

  • Normal Maps and Parallax Occlusion Maps
    : Normal maps are used to project the contour details of a highly detailed object onto a low polygon model by using a high frequency compressed (3DC/BC5) texture in place of the polygon’s surface normal in lighting calculations. CryEngine2 also supports parallax occlusion mapping to give a greater sense of depth to a surface texture applied to a polygon, such as could be used to realistically emphasize the relief surface structure of a brick wall, for example.

  • Real Time Ambient Maps
    : Pre-calculate the amount of ambient (fill) light which will be applied to indoor surfaces, to improve the quality of lighting when applying real-time per-pixel lighting and shadows. This means the current light position and color can be dynamically added to the fill light intensity applied to illuminate surfaces in interior spaces.

  • Subsurface Scattering
    : Simulates the diffusion and diffraction of light transmitted through translucent objects, like ice and jade; it can also be used to create natural looking skin or vegetation.

  • Eye Adaptation & High Dynamic Range (HDR) Lighting
    : Eye Adaptation is used to simulate the human eye’s adaptation to sudden or extreme changes in lighting conditions, like dark indoor environments suddenly transitioning to bright sunny outdoor environments, while HDR allows scenes with extreme brightness and contrast ranges to be more realistically rendered.

  • Motion Blur & Depth of Field
    : Motion Blur is used to simulate the visual effect of using a slow shutter speed when tracking fast moving objects or making quick camera movements. Blur can be applied both to individual objects (object based motion blur), and/or to an entire scene (screen based motion blur), while Depth of Field can be used to focus the viewer’s eye on a nearby object while subtly blurring objects in front or behind the point of focus.

  • Light Beams & Shafts
    : These are used to create visually stunning light beams and shadows when light intersects with solid or highly detailed geometry, and can generate “godray” effects under water.

  • High Quality 3D Ocean Technology
    : Dynamically modifies the ocean surface based on wind and wave direction, generating shoreline soft-clipping breakers automatically where the ocean meets the shore, depending on the shoreline contour and ocean depth, while our caustic simulation creates realistic looking moving shadows and highlights in underwater environments.

  • Advanced Shader Technology
    : A script system used to combine textures and math in different ways to create unique effects such as cloaked, wet, muddy, and/or frozen surfaces which can be layered together and combined with more basic shaders such as metallic and glassy and other visual effects. Supports real time per-pixel lighting, bumpy reflections, refractions, volumetric glow effects, animated textures, transparent computer displays, windows, bullet holes, and shiny surfaces. Included are many unique new shaders which take advantage of the efficiencies of the unified shader architecture of DirectX 10.

  • Terrain LOD Management Feature
    : This feature allows optimal usage of CPU and memory to display closer objects and terrain at a fine level of detail while enabling long view distances of over 8 kilometers.


  • Integrated Multi-threaded Physics Engine: Can be applied to almost everything in a level, including trees and vegetation, to realistically model reactions to forces like wind currents, explosions, gravity, friction and collisions with other objects, without the need for specialized coprocessing hardware. Also allows for character-to-ragdoll and ragdoll-to-character transitions.


  • Advanced Rope Physics
    : Bendable vegetation which responds to wind, rain or character movement, realistically interactive rope bridges, and physically driven creature tentacle animations are just some of the uses to which we’ve put our rope physics technology.

  • Interactive and Destructible Environments
    : Dynamically physicallize (using previously defined breaking or shattering characteristics) any arbitrary environmental object or shape, in order to destroy buildings, trees, or other objects, and then further interact with the resulting pieces.


  • Character Animation System: Our new character animation system considerably advances the state of the art in real-time human, model and vehicle animation. A fully integrated character editor allows animations to be previewed inside of CryENGINE Sandbox2, while our extremely powerful animation graph allows an animator to visually define the animation states of a character, and the allowable transitions between those states.


  • Character Individualisation System
    : The character pipeline uses a robust character attachment system which allows for attachment of skinned, animated, or physicallized attachments to the skeleton or polygonal faces of a character, to the extent that you can even replace entire body parts such as heads, hands, or upper and lower body. A hardware-based shape deformation system allows flexible variation of the character meshes. The system supports manually and even procedurally generated examples to ensure a small memory footprint. An additional variation system based on shaders is used for dirt, decals for clothes, and camouflage shaders for the skin.

  • Parametric Skeletal Animation
    : By blending example-motions based on user-defined parameters, we obtain responsive interactive control over a character with a focus on believability and the ability to adapt automatically and naturally to the changing circumstances of a game environment. This enables the character to travel at different speeds, follow paths where the direction changes smoothly or suddenly, move uphill or downhill, dynamically blend in varying amounts of hit reaction animation, and/or change the style of locomotion.
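In spirit, parametric blending weights example motions according to a control parameter such as travel speed. A minimal sketch (the walk/run speeds and per-joint angle blending are illustrative assumptions, not engine internals, which would blend quaternions):

```python
def blend_weights(speed, walk_speed=1.5, run_speed=5.0):
    """Map a desired travel speed to (walk, run) blend weights."""
    if speed <= walk_speed:
        return (1.0, 0.0)
    if speed >= run_speed:
        return (0.0, 1.0)
    t = (speed - walk_speed) / (run_speed - walk_speed)
    return (1.0 - t, t)

def blend_pose(walk_pose, run_pose, speed):
    """Blend per-joint angles from two example motions."""
    w_walk, w_run = blend_weights(speed)
    return [w_walk * a + w_run * b for a, b in zip(walk_pose, run_pose)]

# Halfway between walking (1.5 m/s) and running (5.0 m/s):
print(blend_weights(3.25))                   # (0.5, 0.5)
print(blend_pose([0, 10], [20, 30], 3.25))   # [10.0, 20.0]
```

The same scheme extends to more parameters (turn rate, slope) by blending more than two examples.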

  • Procedural Motion Warping
    : Procedural algorithms like CCD-IK, analytic IK, example-based IK or physical simulations are used to augment pre-authored animations. All procedural methods have in common that a computer procedurally follows the steps in an algorithm to generate artificial motions. To avoid the typical computer-generated look when combining artificial and captured animations, we use a warping technique that can preserve the style and the content of the base motion, despite the transformations needed to comply with the constraints.

  • High Quality Animation Compression
    : Using our adaptive key frame compression technology, we can adjust the compression level to match the fidelity needed for any given animation while saving at least 90% of the RAM that would otherwise be consumed, without significant loss of motion fidelity.
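One common way to achieve such savings is to drop every keyframe that linear interpolation between its neighbours already reproduces within a tolerance. This sketch illustrates the principle only, not Crytek's actual codec:

```python
def compress_track(keys, tolerance=0.01):
    """Keep only keyframes that interpolation cannot reproduce.
    keys: list of (time, value) pairs, sorted by time."""
    if len(keys) <= 2:
        return list(keys)
    kept = [keys[0]]
    for i in range(1, len(keys) - 1):
        t0, v0 = kept[-1]
        t, v = keys[i]
        t1, v1 = keys[i + 1]
        predicted = v0 + (v1 - v0) * (t - t0) / (t1 - t0)
        if abs(predicted - v) > tolerance:   # key carries real information
            kept.append(keys[i])
    kept.append(keys[-1])
    return kept

# A perfectly linear track collapses to its two endpoints:
print(compress_track([(0, 0.0), (1, 1.0), (2, 2.0), (3, 3.0)]))
```

Tightening or loosening `tolerance` is what lets the compression level be adapted per animation, trading memory for motion fidelity.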


  • Integrated CryENGINE Sandbox2 Editor: The run-time engine is fully integrated into the CryENGINE Sandbox2 editor to give designers “What you see is what you play” functionality.


  • Embedded Facial Animation Editor
    : The powerful new Facial Animation editing tool uses audio waveform analysis technology to automatically extract phonemes and other key features of speech in order to animate facial features and provide convincing lip sync. The sophisticated and convenient multiple joystick-based user interface allows expressions to be defined and combined in many powerful ways, then animated quickly and intuitively. Expressions and animations can be created once, and then seamlessly applied to multiple models. In conjunction with this system, a video tracking tool can be used to capture movements from an actor’s face using a standard video camera, and these movements transferred directly to the desired facial model in the editor, where the expressions and movements can be combined with the lip sync and/or further edited by an animator.

  • Integrated Vegetation and Terrain Cover Generation System
    : Allows procedurally placed vegetation to behave according to natural rules about the desirable and allowable ground slope, surface altitude, and allowable plant density to create believable natural environments at run time without requiring the level designer to custom place every blade of grass or tree. Vegetation can also absorb some color from the underlying terrain texture, to fit more naturally into the environment.
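The placement rules described can be pictured as a simple accept/reject test per candidate spot; the thresholds below are made-up illustrations, not the engine's defaults:

```python
def accept_plant(slope_deg, altitude_m, neighbours_nearby,
                 max_slope=30.0, alt_range=(0.0, 600.0), max_density=3):
    """Accept a procedurally scattered plant only where the ground is
    flat enough, the altitude is in range, and the spot isn't crowded."""
    lo, hi = alt_range
    return (slope_deg <= max_slope
            and lo <= altitude_m <= hi
            and neighbours_nearby < max_density)

print(accept_plant(12.0, 150.0, 1))   # True: gentle slope, valid altitude
print(accept_plant(45.0, 150.0, 1))   # False: cliff face, too steep
print(accept_plant(12.0, 900.0, 1))   # False: above the allowed altitude
```

Each plant species would carry its own rule set, which is what lets a single scatter pass produce believable grass, shrub, and tree distributions.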

  • Advanced Terrain System with Integrated Voxel Objects Technology
    : Allows designers to place overhanging cliffs, caves or tunnels in their levels, and allows them to adjust the terrain detail on a per-sector basis to reduce overall polygon counts.

  • Flow Graph
    : A visual editing system which allows designers to set up events, triggers, and other game logic by connecting various logic boxes to each other with lines between their input and output gates, and defining their properties and state changes. This allows designers to build complex levels without needing to write C++ code or LUA scripts.
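Conceptually, a flow graph is just nodes whose output gates are wired to other nodes' input gates. The trigger/gate/door names below are a hypothetical example of the kind of logic a designer would wire up visually, not Sandbox2's actual node set:

```python
class Node:
    """A minimal flow-graph node: named inputs, a function that maps
    them to named outputs, and wires to downstream nodes."""
    def __init__(self, fn):
        self.fn = fn
        self.inputs = {}
        self.wires = []  # (out_port, target_node, in_port)

    def connect(self, out_port, target, in_port):
        self.wires.append((out_port, target, in_port))

    def fire(self):
        outputs = self.fn(self.inputs)
        for out_port, target, in_port in self.wires:
            target.inputs[in_port] = outputs[out_port]
            target.fire()

# A proximity trigger opens a door when the player is close AND holds a key.
events = []
trigger = Node(lambda ins: {"near": ins["distance"] < 3.0})
gate    = Node(lambda ins: {"open": ins["near"] and ins["has_key"]})
door    = Node(lambda ins: events.append(("door_open", ins["open"])) or {})

trigger.connect("near", gate, "near")
gate.connect("open", door, "open")

gate.inputs["has_key"] = True
trigger.inputs["distance"] = 1.5
trigger.fire()
print(events)  # [('door_open', True)]
```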

  • Advanced Soft Particle System and Integrated FX Editor
    : Simplifies the creation of extremely complex explosion, fire, smoke and other special effects using next generation soft particles, which in turn can be affected by collisions with any other objects, apply or receive forces such as wind or gravity, and can interact with lights and shadows.

  • Dynamic Time of Day Settings
    : Change the time of day dynamically during a game mission to reflect lighting conditions and sun/moon positions over any predefined 24 hour cycle, from a blue and foggy morning sunrise to a fiery orange sunset to a clear cold moonlit night.

  • Road and River Tools
    : These integrated tools greatly simplify the process of locally smoothing and leveling terrain and applying a tiled texture for the creation of paths, roads, or rivers through rugged landscapes.

  • Vehicle Editor
    : Allows designers to set up a vehicle’s damage system, including defining component damage and damage effects, define passenger seats and their connection to such vehicle features as the ability to control tank turrets and/or attached weapons, and provides full control over engine and physics parameters. It also exposes control of a vehicle’s exhaust and surface particle effects.


  • Sound and Music: The sound system in CryEngine2 introduces many new features and improvements with its data-driven concept. Each sound carries its own specification with it, so sound designers are in full control of the final quality of the sound, and sounds are used consistently throughout the game.


  • In Game Mixing
    : Integrated editor functionality and advanced sound specification tools provide efficient mixing by connecting to a running game instance on various target platforms. This constantly guarantees a well mixed game in every development stage by allowing review of the results in either the game itself, or in other editor modes, such as sounds triggered by animations from within the character editor, for example.

  • Data-driven Sound System
    : Complex sounds can be easily created and delivered with studio quality while supporting any available surround sound speaker configuration. Multi-platform compatibility is guaranteed by FMOD’s included sound library.

  • Interactive Dynamic Music System
    : Improved playback of music tracks by specially defined logic that reacts to any desired game event, in order to give the player a movie-like soundtrack experience.

  • Environmental Audio
    : This feature allows a sound designer to achieve a dense sound impression by accurately reproducing sounds from nature, with seamless blending between different environments, for example the effect of moving from an interior to an exterior location.

  • Dynamic World Sounds
    : Any physical contact can spawn a unique sound controlled by various parameters such as material type, object type, mass, and speed. This technique provides non-repetitive and responsive audio feedback to movement in an interactive game world.


  • Advanced AI System: CryENGINE™ 2 has a flexible and easily customizable AI system which handles both character and vehicle behaviors. It fully supports the complex requirements of the character locomotion system to animate bipedal characters in a believable fashion, and is fully integrated into the CryENGINE Sandbox2 editor.


  • LUA Script Driven AI System
    : Allows complex AI behaviors to be created without requiring new C++ code, including extending state machine behaviors from LUA scripts.

  • Dynamic Path Finding
    : Advanced 2D and 3D algorithms allow the AI navigation paths to be modified in real time in response to events which create new or destroy existing paths, a critical feature for creating believable AI in a highly interactive and destructible environment.
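The essence of dynamic pathfinding is that the navigation graph can be edited at run time and the route simply re-planned on the modified graph. A breadth-first sketch (the tiny four-node graph is purely illustrative):

```python
from collections import deque

def shortest_path(graph, start, goal):
    """BFS over an adjacency dict; returns the node list of a shortest
    path, or None if the goal is unreachable."""
    frontier, came_from = deque([start]), {start: None}
    while frontier:
        node = frontier.popleft()
        if node == goal:
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        for nxt in graph.get(node, []):
            if nxt not in came_from:
                came_from[nxt] = node
                frontier.append(nxt)
    return None

# A tiny nav graph; an explosion then destroys the B-D bridge and
# the path is re-planned around it.
nav = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"], "D": ["B", "C"]}
before = shortest_path(nav, "A", "D")   # ["A", "B", "D"]
nav["B"].remove("D")
nav["D"].remove("B")                     # bridge destroyed
after = shortest_path(nav, "A", "D")    # ["A", "C", "D"]
print(before, after)
```

A production engine would use weighted, spatialized search (A* over a nav mesh), but the re-plan-on-change principle is the same.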

  • Smart Objects
    : Provides an easy way for level designers to connect specialized animations to particular objects in the level, so that the character animations and objects are properly aligned at the start and end of the animations, and the correct animation sequence plays.


  • Resource Compiler: Assets are compiled from their original formats to an optimized platform dependent one by the resource compiler at project build time. This allows making global changes (e.g. mipmap computation, mesh stripification) to the output data depending on presets and target platforms without affecting the final level loading times, or requiring developers to keep multiple versions of assets on hand for different platforms.
  • Modular C++ Design: Entirely written in modular C++, fully documented and commented, and divided into logical separate DLL’s, you can use what you need as-is, and modify or replace only the components in our engine that you particularly require to customize for your individual project requirements.
  • Multithreading Support: To get the most out of modern multicore processor architectures, CPU intensive subsystems of CryENGINE™ 2 such as physics, networking and sound, have been re-written to support multi-threading.
  • Performance Analysis: Powerful instrumentation features allow the developer to analyze engine performance in real time, create detailed memory usage reports, and run automated walk-throughs of each level to get consistent test results from build to build.
  • Offline Rendering: Creating streaming videos or still images from within the game is made easier by the inclusion of specific console commands which can output a scene at any arbitrary screen resolution and/or aspect ratio, including generating autostitched panoramic views for use on 360 degree projection video displays.
  • Streaming System: Assets can be loaded on demand at run time to allow for larger levels and increased complexity, while reducing the amount of available system RAM required.
  • Network Client and Server System: CryENGINE™ 2 has a totally new, multi-threaded networking system which manages all connections for multiplayer mode. It features a highly reliable, low-latency, low bandwidth system based on client/server architecture using advanced range encoding based compression algorithms.
    Right-click to download the complete CryENGINE™ 2 features document in PDF format » here «

Posted

Images of CryEngine 2.0

An interesting piece of information is that Crytek has now turned its expertise ($) toward the PS3. As can be seen on Crytek's site, in an interview from 2005 if I'm not mistaken, they already had plans for that platform.

http://www.gametrailers.com/player.php?id=18589&type=wmv&pl=game

According to the description, the textures are the product of unified shaders (per-pixel); the graphics WOULD BE FULLY supported by the next-generation consoles, since both have unified shaders.

PS: the new forum software is dreadful; I have to log in all the time, something I never had a problem with on the old one. And how heavy the pages are!

PS: three liters of wine were needed to muster the patience.

Posted

  sandrovilanova said:

According to the description, the textures are the product of unified shaders (per-pixel); the graphics WOULD BE FULLY supported by the next-generation consoles, since both have unified shaders.

If the textures are generated by unified shaders, then the 360 has a big advantage over the PS3 in textures when these engines are used, since its GPU has, I think, 50% more unified shaders than the PS3, right?

I really liked this topic; lots of information about engines that I didn't know about yet.

Cheers!

Posted

Source engine

From Wikipedia, the free encyclopedia

A Half-Life 2: Episode One scene running on the Source engine, demonstrating facial expressions and realtime cameras, among other effects.


In-engine Team Fortress 2 character line-up, demonstrating a cartoon-oriented set of basic shaders, depth of field, facial animation, and dynamic shadowing.

The Source engine is a game engine developed by Valve. Its unique features include a large degree of modularity and flexibility, an artist-driven, shader-based renderer, industry-leading lip sync and facial expression technology, and a powerful, efficient and completely network-enabled physics system.

Source supports both single-player and multiplayer environments across multiple platforms. It debuted in October 2004 with Counter-Strike: Source.


Technology overview


The Source engine's vastly improved graphics (on the right) compared to those of its predecessor, GoldSrc (on the left)

For a full overview, see the Source engine documentation on the Valve Developer Community Wiki.

  • Modularity: see the Modularity and notable upgrades section below.
  • Rendering: a shader-based renderer with 3D skyboxes; an area can be displayed as a skybox at up to 16x its actual size, rendered in full 3D.
  • Animation: any animation can merge seamlessly with any other animation at any time; inverse kinematics ensure that characters' limbs react to their environment.
  • Networked physics: Source's physics simulator is extremely flexible, even online; in one community example, players turned into melons rope themselves to a Scanner NPC and drag it around the map (see Garry's Mod). The simulator is highly tuned by Valve, processor-efficient, software-only, and fully networked with low bandwidth requirements, with vehicle physics covering torque, power, gears, tire material, and suspension.
  • Audio: full surround support, software-only, with low and high frequency components merged depending on the surrounding area and the relative position of the sound's origin.
  • Scalability: supports older DirectX hardware and upwards; modularity allows all current and future Source projects to scale back to DX6 if they desire.
  • Facial expressions: a full range of human and non-human facial movements, with over eighty-four "digital muscles"; lip sync works in tandem with facial expressions and is auto-generated but completely configurable, stored in the sound file itself.

Modularity and notable upgrades

Source is designed from the ground up to be highly modular. This allows for the easy upgrade and modification of certain features without breaking other areas of the engine, or breaking engine continuity (that is to say, there need be no 'version jumps' from 1.0 to 2.0). When coupled with Steam, these updates can be distributed retroactively and automatically. For instance, if Source is upgraded to support a new rendering feature, every Source title on Steam will instantly benefit. Entirely new features such as High Dynamic Range (HDR) rendering have been shown to require developer input, however.

High dynamic range rendering

HDR rendering was the first major instance of Source's modularity in use. However, whilst in theory all Source engine games and mods were able to use HDR immediately after its release, the game code required to 'hook in' to the new system was not made available to modders until eleven months later, on August 4, 2006. Official licensees and Valve themselves have all made use of the technology since its release.


Alyx Vance animated with the Source engine's facial animation system

Facial animation 2

When Half-Life 2: Episode One was released, it introduced the second version of Valve's proprietary facial animation system. Ken Birdwell explains the upgrade's features in the game's commentary track:

“When we designed the Half-Life 2 facial system back in 2000, our goal was to get a natural-looking performance at a moderate distance. For Episode One, we wanted to extend the characters' facial systems to support more intense performances with a wider range of facial expressions, that would hold up better at close range. These facial improvements included increasing the detail around the eyes and mouth, increasing the number of facial shape targets – think of these as movements of muscle groups – by about 50%, rewriting the rules that control how these shapes blend, and increasing the intensity of many of our existing shapes.”

Posted

Future technologies

Dynamic lighting and shadowing 2

A new dynamic lighting and shadow mapping system is being developed for Source, replacing the limited existing system.[2] [3] It will launch alongside various other new Source features in Half-Life 2: Episode Two, expected to be released in Q3 2007.

Lighting and shadowing system comparison (Current system → Dynamic Lighting and Shadowing 2):

  • Dynamic shadows in a map always come from the same predetermined direction. → Dynamic shadows react dynamically to every light source. (unconfirmed)
  • Models do not self-shadow or cast shadows onto other objects. → Models can self-shadow and cast shadows onto the world and other objects. (unconfirmed)
  • Dynamic shadows do not blend with lightmapped shadows and cast through all objects except world geometry. → Dynamic shadows are more unified with static shadows and do not cast through models. (unconfirmed)
  • Every object is allowed only one dynamic shadow. → Any object can cast multiple dynamic soft shadows. (unconfirmed)
  • The player's flashlight merely illuminates an area. → The player's flashlight casts shadows from models and world geometry.

The new system is a work-in-progress and unconfirmed features are prone to being added and expanded upon; likewise, currently known and seen features are subject to change or removal.

Next-gen renderer

An upgraded rendering path is in development for future Source engine games on PC, Xbox 360, [4] and presumably PlayStation 3.

Landscape and Flora Rendering


An open gorge environment in Half-Life 2: Episode Two

Large, open natural environments with heavy foliage, traditionally a weakness for the Source engine, will be supported as of Half-Life 2: Episode Two. The updates will also be available for teams to use shortly after the release.

Soft-particle system


Team Fortress 2's Pyro, demonstrating real-time lighting, self-shadowing and soft particle technology

During the July 2006 Electronic Arts Summer Showcase press conference, it was mentioned that a new soft-particle system would be introduced into the Source engine in the upcoming title Team Fortress 2. It was first demonstrated in the July 19 teaser, which showed a remarkably realistic flamethrower in its closing moments.

Cinematic physics


Cinematic Physics oversees the destruction of a two-story forest shack

During the July 2006 Electronic Arts Summer Showcase press conference, it was revealed that former Weta Digital employee Gray Horsfield, special effects destruction lead on The Return of the King and King Kong among other roles, is building a "Cinematic Physics" system for Source. GameSpy described the new system in their conference report:

“The idea behind this is to give players the opportunity to experience in-game physics in action on a grander scale. As an example of Cinematic Physics in action, a clip from Half-Life 2: Episode Two was shown of a huge bridge collapsing across a vast ravine.”

The system appears to add the following features to Source's physics simulator:

  • Deforming objects — before, physics models could not be modified except through animation
  • Dynamic crumbling of brush geometry — before, lines of separation had to be specified by the mapper

Cinematic Physics supports a keyframe system, [5] but its exact nature is currently unclear. It could be that an animator creates a largely complete but low-detail sequence which then sees details added by the physics system, or it could be that an animator creates a handful of single-frame states which are then used as motion targets for the ensuing simulation (in a manner not dissimilar to the Endorphin NaturalMotion technology).

Either method results in a drastic reduction of developer input, thus allowing the creation of far more complex scenes than before with the same budget. It is currently unclear both whether or not keyframes are strictly required, and what number are needed to create a scene as complex as the bridge collapse demonstration.

Multiprocessor optimizations


A hundreds-strong swarm of AI entities avoid danger in a multiprocessor benchmark

As a part of the Source engine's transition to next-generation consoles, multiprocessor optimizations have been added, resulting in faster processing on PC hardware with multi-core or multi-processor systems and on the Xbox 360 and PlayStation 3 consoles. Gabe Newell:

“Yes. We definitely think that content needs to move forward. For example, one of the things we're reacting to is the speed at which microprocessors are coming out. So, Intel has very aggressively moved up delivery of desktop processors with four different cores; we'll have support for that in Episode Two, and we'll definitely go back to affect, you know, Episode One or Half-Life 2 or Counter-Strike Source, so they can take advantage of that. We'll definitely try to keep the existing games - especially the multiplayer games - current as technology evolves.”

Valve has demonstrated the new multi-core optimizations, which use a multi-threading style they dub "hybrid threading." A Source multi-threading update and benchmark are expected to be released, though a date for either, and the content of the benchmark, are currently unknown.

Unconfirmed future technology

Cinematic effects


Cinematic effects including depth of field, motion blur and film grain are demonstrated in this video capture

With color correction and film grain already released, [9] Valve intends to add other cinematic effects such as motion blur and depth of field to Source when hardware is able to render them to its satisfaction. The effects are accomplished with an accumulation buffer for quality, creating enormous overhead; for instance, twenty to thirty motion blur frames need to be rendered for every one frame that the user sees. For a constant thirty frames per second, a video card is required to produce between six hundred and nine hundred frames per second. This caused late 2005-era hardware to require a full two seconds to render each frame.
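The fill-rate arithmetic quoted above is easy to check:

```python
def required_render_rate(display_fps, subframes_per_frame):
    """Frames per second the GPU must actually render when motion blur
    is accumulated from multiple sub-frames per displayed frame."""
    return display_fps * subframes_per_frame

# 20-30 sub-frames at a constant 30 fps display rate:
print(required_render_rate(30, 20))  # 600
print(required_render_rate(30, 30))  # 900
```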

Motion blur and depth of field can be seen in several of Valve's promotional videos.

Image-Based Rendering

Image-based rendering is a technique in which 2D elements are manipulated to appear in a 3D world. In the context of a 3D game, it delivers a significant performance boost by replacing 3D geometry that is far enough away for the transition to be imperceptible with a 2D image. The technique has been applied to distant soldiers, forests, and various environmental objects such as buildings and flora in several titles.
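A sketch of the impostor idea: swap a mesh for a pre-rendered image once it is far enough away that the change is imperceptible (the 300 m threshold and scene names are invented for illustration):

```python
def choose_representation(distance_m, impostor_distance=300.0):
    """Pick full 3D geometry up close and a 2D impostor far away."""
    return "geometry" if distance_m < impostor_distance else "impostor"

scene = {"soldier": 40.0, "building": 800.0, "far_forest": 1200.0}
print({name: choose_representation(d) for name, d in scene.items()})
# {'soldier': 'geometry', 'building': 'impostor', 'far_forest': 'impostor'}
```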

The technology had been in development for Half-Life 2, as a 2003 interview shows, but was cut. It was mentioned again during a 'Valve week' press event:

“There's this technology that was really exciting that I’d like to see us get into production, which is a different approach to rendering complexity: Moving things into and out of an image domain and then seamlessly interpolating between those motions as you move around. So that everything close to you is physical and geometry, and everything really far away from you is an image, but you have no way of telling that if you do it properly and things can fly out and come back.”

The June 2006 update included an "image-based texture blending shader", and Episode Two's expansive environments seem ideally suited to the technology, but it has yet to be dated or even officially announced.

File streaming

One of the technologies developed for Episode One's release was file streaming, wherein a map's resources could be loaded as the player moved around in it rather than in one operation before play. With the system in place, loading times were reduced to as little as fifteen seconds. The system expanded on the caching system already implemented. There is no time frame for its release, as implementing such a system on the potentially infinite variations of PC hardware setups in use poses serious performance problems (see the Stutter section below).

Origins

Although Valve has explicitly stated that the Source engine has been built internally from the ground up, rumors and myths persist that it is instead merely derived from the original GoldSrc codebase. The primary reasons for this are the manner in which the engine uses similar development interfaces to GoldSrc (to aid transitioning developers), and a developer's comment that "there are still bits of early Quake code in Half-Life 2", expanded through hearsay into a confirmation that large swathes of code are identical, when no such conclusion can be drawn from the statement. There remains no solid proof that Source is derived from the GoldSrc codebase; indeed, given that the 2003 source code leak did not produce any such claims, it can only be assumed that no incriminating evidence was to be found.

However, it is known that Source was developed part-by-part, slowly replacing the GoldSrc engine in Valve's internal projects. This explains its modular nature, and suggests that, even if Source was not derived from GoldSrc, GoldSrc was at the very least modified to plug into it during development.

Common issues

Stutter

The Source engine uses an on-demand caching system, whereby the loading of certain resources is handled and managed on the fly rather than in a single operation behind a traditional loading screen. Texture and sound data are the primary areas in which this occurs. Textures are loaded into system memory but only moved to video memory when needed, and audio files are loaded with an unusual "soundcache" system: only the first 0.125 seconds of each file are pre-cached, and that clip is used to cover the buffering of the full sound file in the background when it is first requested.
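The size of such a 0.125-second pre-cache is small; for example, assuming 44.1 kHz 16-bit stereo audio (a format assumption for illustration, not a documented engine detail):

```python
def precache_bytes(sample_rate_hz=44100, channels=2, bytes_per_sample=2,
                   precache_seconds=0.125):
    """Size of the pre-cached clip that covers background streaming of
    the rest of the file when a sound is first requested."""
    return int(sample_rate_hz * channels * bytes_per_sample * precache_seconds)

print(precache_bytes())  # 22050 bytes (~21.5 KiB) per sound
```

Keeping only this sliver resident is what lets thousands of sounds be "loaded" without holding their full waveforms in memory.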

Both systems keep data in the cache until there is no more room and older resources are flushed out; when either is held up or otherwise slowed down, the engine will either freeze or go into a temporary loop until the data arrives. 'Stuttering', or 'hitching' as it is sometimes known, is the result of these pauses.

While stutter can be caused by poor system performance, it has also been noted on hardware setups that should be more than powerful enough to cope with the data rate, and despite many theories, the precise cause remains unknown to the public even over two years after the engine's debut. Most solutions that have been found involve bypassing the caching system, as it cannot be directly disabled, or system-specific optimizations (e.g. driver updates).

When Half-Life 2 was first released and stuttering became a widely-known problem, community member Mark McWilliams set up a page covering the issue and Valve's communication and work on resolving it. Several updates were released by Valve, the effects of which varied from complete fixes for some users to previously smooth systems becoming "infected" with the problem.

Most recently, changes to the Source engine were introduced alongside a beta of Steam's chat service, with the aim of 'narrowing down' the problem. The update featured a limited implementation of Source's file streaming system (see above). Generally, the response was very positive.

Looping audio

The Source engine suffers from an error whereby the asynchronous loading (see Stutter, above) of a new sound file can cause the engine to lock up with looping audio. Because of the nature of fullscreen rendering, once the engine enters such a state it will remain on the screen unless the user can blindly terminate the program or reboot the computer. The error occurs in a standard Windows configuration associated with on-board audio, and in some cases can be resolved by decreasing audio hardware acceleration.

Reports of looping audio crashes increased around the release of Episode One. While it is likely that the spike was simply due to an unusually high number of people playing the game, changes to the engine, of which there were many for that game, cannot be ruled out. It has been noted that people who were able to play Half-Life 2 without any crashes or audio errors do find Episode One more prone to the problem. A steampowered.com forum thread is dedicated to discussing the problem and attempting to work out solutions, although the experience of the thread contributors indicates that most suggested workarounds and fixes do not remedy the problem.

Valve Developer Community

On 28 June 2005, Valve opened the Valve Developer Community (VDC) wiki. The VDC replaced the previously available documentation with a full wiki-powered community site. Within a matter of days Valve reported that "the number of useful articles [had] nearly doubled". These new articles covered the previously undocumented bot (added by the bot's author) and Half-Life 2 AI, resources for Source engine mods, and more.

Archived

This topic is now archived and closed to further replies.
