Search Results

Search found 2787 results on 112 pages for 'triple channel'.


  • XNA hlsl tex2D() only reads 3 channels from normal maps and specular maps

    - by cubrman
    Our engine uses deferred rendering and at the main draw phase gathers plenty of data from the objects it draws. In order to save on tex2D calls, we packed our objects' specular maps with all sorts of data, so three out of four channels are already taken. To make it clear: I am talking about the assets that come with the models and are stored in their material's Specular Level channel, not about the RenderTarget. So now I need one more piece of information to be stored in the alpha channel, but I cannot make the shader read it properly! No matter what I write into alpha, it ends up being 1 (255)! I tried: saving the textures in PNG/TGA formats; turning off pre-computed alpha in the model's properties. Out of every texture available to me (we use a Diffuse map, Normal Map and Specular Map) I was only able to read alpha successfully from the Diffuse Map! Here is how I add specular and normal maps to my model's material in the content processor: if (geometry.Material.Textures.ContainsKey(normalMapKey)) { ExternalReference<TextureContent> texRef = geometry.Material.Textures[normalMapKey]; geometry.Material.Textures.Remove("NormalMap"); geometry.Material.Textures.Add("NormalMap", texRef); } ... foreach (KeyValuePair<String, ExternalReference<TextureContent>> texture in material.Textures) { if ((texture.Key == "Texture") || (texture.Key == "NormalMap") || (texture.Key == "SpecularMap")) mat.Textures.Add(texture.Key, texture.Value); } In the shader I obviously use: float4 data = tex2D(specularMapSampler, TexCoords); but data.a is always 1 in my case. Could you suggest a reason?

    Read the article

  • Leading Analyst Firm Positions Oracle in Leaders Quadrant for Web Content Management

    - by Christie Flanagan
    Gartner, Inc. has named Oracle a Leader in its latest “Magic Quadrant for Web Content Management.” Gartner’s Magic Quadrants position vendors within a particular quadrant based on their completeness of vision and their ability to execute on that vision. According to Gartner, “WCM plays an increasingly important role in business performance. It has become the central point of coordination for initiatives involving the enterprise's online presence, and these initiatives have become more sophisticated and more important to enterprises' business strategies. Thus, WCM is key for organizations wishing to execute a strategy of OCO (online channel optimization) that embraces areas such as customer experience management, e-commerce, digital marketing, multichannel marketing and website consolidation.” Gartner continued, “Leaders should drive market transformation. Leaders have the highest combined scores for Ability to Execute and Completeness of Vision. They are doing well and are prepared for the future with a clear vision and a thorough appreciation of the broader context of OCO. They have strong channel partners, a presence in multiple regions, consistent financial performance, broad platform support and good customer support. In addition, they dominate in one or more technologies or vertical markets. Leaders are aware of the ecosystem in which their offerings need to fit. Leaders can: demonstrate enterprise deployments; offer integration with other business applications and content repositories; provide a vertical-process or horizontal-solution focus.” Oracle WebCenter, the engagement platform powering exceptional experiences for customers, employees and partners, connects people and information by bringing together the most complete portfolio of portal, Web experience management, content, social, and collaboration technologies into a single integrated product suite. Oracle WebCenter also provides the foundation for Oracle Fusion Middleware and Oracle Fusion Applications to deliver a next-generation user experience.  To see the latest reports, webcasts and demonstrations about Oracle's web experience management solution, Oracle WebCenter Sites, please visit our Connected Customer Experience Resource Center.

    Read the article

  • Flash Player for IPTV

    - by Le Mystique
    Hi All, We have to design a flash player for three video sources of IPTV: educational videos on the system that can be played on demand (local VOD), streaming videos available on different websites that can be played on demand (remote VOD), and the IPTV live streaming video feed. Videos from all three sources have to be played within a single flash player that we need to design. There are 2 basic requirements: For the two VOD types, we have to provide pause and play functionality. For IPTV, pause and play are not required; however, channel switching time is critically important here. The user should get behaviour as close to the TV experience as possible, which means channel switching should be fast. To provide this, buffering of video is required for multiple channels. Now the question is: is this possible? If yes, how? Secondly, what skill set (e.g. ActionScript 3.0, Flash 10) should the flash programmer have to be able to design this? Thirdly, is it as simple as making a normal FLV player, or is it a more complex job?

    Read the article

  • JavaOne Content on Video

    - by Tori Wieldt
    JavaOne content is available in video in three sizes, depending on whether you want to have a sip, have a drink, or go to the proverbial firehose. Tall (Keynote Highlights): Go to the JavaOne playlist on the YouTube Java channel for highlights of the JavaOne Keynotes.  Grande (Keynotes in Full): Go to the Oracle Media Network JavaOne 2012 channel to view the keynotes in full (Community Keynote coming soon). Venti (All Sessions, BOFs and Tutorials): To view slides paired with audio of each session, go to the JavaOne content catalog (JavaOne homepage > Programs > Content Catalog) and select a session. If a video is available, you'll see "Media" in the right column. Look under "Presentation Download" to get the slides. Sessions are being made available as quickly as possible. "It's exciting to see Oracle take community stewardship so seriously," said Sharat Chander, Group Director for Java Technology Outreach. "Making all JavaOne sessions on video available online for free will help make the future of Java for everyone."

    Read the article

  • USB flash module giving errors

    - by vshenoy
    Hi, I have a SATA USB flash module which was earlier running a 2.4 Linux kernel (2.4.36.6) and on which I am now trying to install Ubuntu Server 10.04.1 LTS. I have two such USB flash modules, and on one of them the installation process itself gives these errors: sd 4:0:0:0 [sda] Device not ready sd 4:0:0:0 [sda] Result: hostbyte=DID_OK driverbyte=DRIVER_SENSE sd 4:0:0:0 [sda] Sense Key : Not Ready [current] sd 4:0:0:0 [sda] Add. Sense: Medium not present sd 4:0:0:0 [sda] CDB: Write(10): 2a 00 00 05 48 02 00 00 04 00 end_request: I/O error, dev sda, sector 46114 usb 1-1: reset high speed USB device using ehci_hcd and address 2 Buffer I/O error on device sda1, logical block 172033 lost page write due to I/O error on sda1 Buffer I/O error on device sda1, logical block 172034 lost page write due to I/O error on sda1 On the other, the installation is successful, but after a day or two of running, the machine hangs because the kernel spews these messages: Remounting filesystem read-only EXT2-fs error (device sda1): read_block_bitmap: Cannot read block [bitmap - block_group = 105, block_bitmap = 860161] EXT2-fs error (device sda1): ext2_get_inode: unable to read inode block - inode=13083, block=24683 ext2_free_inode: bit already cleared for inode 83966 and the machine needs to be hard rebooted. On both systems, SCSI emulation with the usb_storage driver is being used to detect the module. Here is the output of /proc/scsi/scsi on 2.4: # cat /proc/scsi/scsi Attached devices: Host: scsi0 Channel: 00 Id: 00 Lun: 00 Vendor: TS Model: UFM Rev: 1100 Type: Direct-Access ANSI SCSI revision: 02 and on 2.6: # cat /proc/scsi/scsi Attached devices: Host: scsi6 Channel: 00 Id: 00 Lun: 00 Vendor: TS Model: UFM Rev: 1100 Type: Direct-Access ANSI SCSI revision: 00 i.e. only 'ANSI SCSI revision:' is shown as different, although I am not sure if this can cause any problem. I would really appreciate it if someone could point out how to debug this issue, or any mailing list where I can ask further questions about this.

    Read the article

  • How to connect to my own WiFi using Broadcom STA drivers?

    - by Chris
    I'm trying hard to switch to Linux from Windows because of my engineering project. Unfortunately, everything is against that change! Before I installed the Broadcom STA proprietary drivers, I was seeing only local radio-internet-access networks in NetworkManager and nm-applet. After I installed Broadcom STA, I see my neighbor's wireless network (channel 11, WEP). Neither before nor after the installation is my own wireless network visible. Computer: Asus Lamborghini VX6 Ubuntu: 12.04 LTS 64-bit Router: ASUS N55U (A1) with newest AsusWRT firmware Network: Channel 5 (also tried 10 and 11, both on 20 and 40MHz bands), WPA2 Personal, 2.4 + 5 GHz (which is not very important, because the WLAN card in the VX6 is 2.4GHz only). The network works fine on Windows, including through a D-Link repeater on the other floor. Unfortunately, the same network is invisible to Ubuntu on the same machine. I have tried some combinations with other GUIs but it did not work. Are there any better drivers for Ubuntu? I need that network badly, but I'm an Ubuntu newbie, so I don't know how to solve this problem. Please help.

    Read the article

  • Functional Programming, JavaScript and UI - some neophyte questions

    - by jamesson
    This has been discussed in other threads; however, I am hoping for some comments relevant to UI and an explanation of some vitriol I had flung my way in a Certain IRC Channel Which shall remain nameless. In the discussion here, the comments in the accepted answer suggest that I approach the given code from a functional perspective, which was new to me at the time. Wikipedia said, among other things, that FP "avoids state and mutable data", which, according to the discussion, includes global vars. Now, since I am already pretty far along in my project, I am not going to learn FP before I finish, but... How is it possible to avoid global vars if, for instance, I have a UI whose entire functionality changes if a mouse button is down? I have a number of things like this. Why was there a strong negative reaction in the Certain IRC Channel to implementing FP in JS? When I brought up what seemed to me to be supportive comments by Crockford, people got even madder. Now, this being IRC, there is no rep system, but they at least gave some indication of having read TGP (which I haven't gotten to yet), so I'm assuming they're not idiots. Many thanks in advance, Joe

    Read the article

  • JavaOne Content Available for Free

    - by Tori Wieldt
    JavaOne content is available in video in three sizes, depending on whether you want to have a sip, have a drink, or go to the proverbial firehose. Tall (Keynote Highlights): Go to the JavaOne playlist on the YouTube Java channel for highlights of the JavaOne Keynotes.  Grande (Keynotes in Full): Go to the Oracle Media Network JavaOne 2012 channel to view the keynotes in full (Community Keynote coming soon). Venti (All Sessions, BOFs and Tutorials): To view slides paired with audio of each session, go to the JavaOne content catalog (JavaOne homepage, click on JavaOne Technical Sessions) and select a session. If a video is available, you'll see "Media" in the right column. Look under "Presentation Download" to get the slides. Sessions are being made available as quickly as possible. "It's exciting to see Oracle take community stewardship so seriously," said Sharat Chander, Group Director for Java Technology Outreach. "Making all JavaOne sessions on video available online for free will help make the future of Java for everyone." Thanks to Oracle for funding this and providing it to the Java community for free.

    Read the article

  • Can you recommend a game server for a facebook board game?

    - by Yekmer Simsek
    I am seeking a game server that will scale well. All commercial and/or free software alternatives are welcome. The game will be a board game similar to poker. Some technical details are listed below. There will be tables of 4 people each; to send them messages I need a channel manager. A table will be ready to play for at least 5 minutes. There should be a reliable channel manager. People will wait for some time (i.e.), and if they are not playing they will be kicked by the server, so there will need to be a reliable timed task queue to execute some tasks. The server should be quick enough to respond and show the changes to all 4 people at that table simultaneously. To achieve this, the server should have a powerful I/O library. I am thinking of keeping state in memory to get quick response times, but that comes with scalability problems. Also, some variables need to be thread safe across multiple nodes. Flash (AS3) and Unity (.NET 2.0 C# Mono) client APIs should be available for socket connections. PS: I am using RedDwarf Server; it lacks documentation and multiple-node support.

    Read the article

  • JavaOne SF Session Content Now Available! Plus Seminar Information

    - by OTN-J Master
    JavaOne 2012 San Francisco session content is now available online! For keynote highlights, go to the "JavaOne playlist" on the YouTube Java Channel. To view the keynotes in full, go to the "JavaOne 2012 channel" on the Oracle Media Network (the Community Keynote will be posted soon). For technical sessions, BOFs and tutorials, search the JavaOne Content Catalog, select a session, and look for "Media" to view the slides together with the audio. In addition, a JavaOne 2012 report session is planned as part of the upcoming Oracle Days seminars: following the JavaOne event held in Japan in April 2012 for the first time in seven years, this session on October 31, 15:00-15:45 will review the latest Java news and announcements from JavaOne 2012, presented by a Java evangelist from the Fusion Middleware organization of Oracle Japan.

    Read the article

  • Help trying to get two-finger scrolling to work on Asus UL80VT

    - by Dan2k3k4
    Multi-touch works fine on Windows 7 with: two-fingers scroll vertical and horizontally, two-finger tap for middle click, and three-finger tap for right click. However with Ubuntu, I've never been able to get multi-touch to "save" and work, I was able to get it to work a few times but after restarting - it would just reset back. I have the settings for two-finger scrolling on: Mouse and Touchpad Touchpad Two-finger scrolling (selected) Enable horizontal scrolling (ticked) The cursor stops moving when I try to scroll with two fingers, but it doesn't actually scroll the page. When I perform xinput list, I get: Virtual core pointer id=2 [master pointer (3)] ? Virtual core XTEST pointer id=4 [slave pointer (2)] ? ETPS/2 Elantech ETF0401 id=13 [slave pointer (2)] I've tried to install some 'synaptics-dkms' bug-fix (from a few years back) but that didn't work, so I removed that. I've tried installing 'uTouch' but that didn't seem to do anything so removed it. Here's what I have installed now: dpkg --get-selections installed-software grep 'touch\|mouse\|track\|synapt' installed-software libsoundtouch0 --- install libutouch-evemu1 --- install libutouch-frame1 --- install libutouch-geis1 --- install libutouch-grail1 --- install printer-driver-ptouch --- install ptouch-driver --- install xserver-xorg-input-multitouch --- install xserver-xorg-input-mouse --- install xserver-xorg-input-vmmouse --- install libnetfilter-conntrack3 --- install libxatracker1 --- install xserver-xorg-input-synaptics --- install So, I'll start again, what should I do now to get two-finger scrolling to work and ensure it works after restarting? Also doing: synclient TapButton1=1 TapButton2=2 TapButton3=3 ...works but doesn't save after restarting. However doing: synclient VertTwoFingerScroll=1 HorizTwoFingerScroll=1 Does NOT work to fix the two-finger scrolling. Output of: cat /var/log/Xorg.0.log | grep -i synaptics [ 4.576] (II) LoadModule: "synaptics" [ 4.577] (II) Loading /usr/lib/xorg/modules/input/synaptics_drv.so [ 4.577] (II) Module synaptics: vendor="X.Org Foundation" [ 4.577] (II) Using input driver 'synaptics' for 'ETPS/2 Elantech ETF0401' [ 4.577] (II) Loading /usr/lib/xorg/modules/input/synaptics_drv.so [ 4.584] (--) synaptics: ETPS/2 Elantech ETF0401: x-axis range 0 - 1088 [ 4.584] (--) synaptics: ETPS/2 Elantech ETF0401: y-axis range 0 - 704 [ 4.584] (--) synaptics: ETPS/2 Elantech ETF0401: pressure range 0 - 255 [ 4.584] (--) synaptics: ETPS/2 Elantech ETF0401: finger width range 0 - 16 [ 4.584] (--) synaptics: ETPS/2 Elantech ETF0401: buttons: left right middle double triple scroll-buttons [ 4.584] (--) synaptics: ETPS/2 Elantech ETF0401: Vendor 0x2 Product 0xe [ 4.584] (--) synaptics: ETPS/2 Elantech ETF0401: touchpad found [ 4.588] (**) synaptics: ETPS/2 Elantech ETF0401: (accel) MinSpeed is now constant deceleration 2.5 [ 4.588] (**) synaptics: ETPS/2 Elantech ETF0401: MaxSpeed is now 1.75 [ 4.588] (**) synaptics: ETPS/2 Elantech ETF0401: AccelFactor is now 0.154 [ 4.589] (--) synaptics: ETPS/2 Elantech ETF0401: touchpad found Tried installing synaptiks but that didn't seem to work either, so removed it. Temporary Fix (works until I restart) Doing the following commands: modprobe -r psmouse modprobe psmouse proto=imps Works but now xinput list shows up as: Virtual core pointer id=2 [master pointer (3)] ? Virtual core XTEST pointer id=4 [slave pointer (2)] ? ImPS/2 Generic Wheel Mouse id=13 [slave pointer (2)] Instead of Elantech, and it gets reset when I reboot. 
Solution (not ideal for most people) So, I ended up reinstalling a fresh 12.04 after indirectly playing around with burg and plymouth then removing plymouth which removed 50+ packages (I saw the warnings but was way too tired and assumed I could just 'reinstall' them all after (except that didn't work). Right now xinput list shows up as: ? Virtual core pointer --- id=2 [master pointer (3)] ? ? Virtual core XTEST pointer --- id=4 [slave pointer (2)] ? ? ETPS/2 Elantech Touchpad --- id=13 [slave pointer (2)] grep 'touch\|mouse\|track\|synapt' installed-software libnetfilter-conntrack3 --- install libsoundtouch0 --- install libutouch-evemu1 --- install libutouch-frame1 --- install libutouch-geis1 --- install libutouch-grail1 --- install libxatracker1 --- install mousetweaks --- install printer-driver-ptouch --- install xserver-xorg-input-mouse --- install xserver-xorg-input-synaptics --- install xserver-xorg-input-vmmouse --- install cat /var/log/Xorg.0.log | grep -i synaptics [ 4.890] (II) LoadModule: "synaptics" [ 4.891] (II) Loading /usr/lib/xorg/modules/input/synaptics_drv.so [ 4.892] (II) Module synaptics: vendor="X.Org Foundation" [ 4.892] (II) Using input driver 'synaptics' for 'ETPS/2 Elantech Touchpad' [ 4.892] (II) Loading /usr/lib/xorg/modules/input/synaptics_drv.so [ 4.956] (II) synaptics: ETPS/2 Elantech Touchpad: ignoring touch events for semi-multitouch device [ 4.956] (--) synaptics: ETPS/2 Elantech Touchpad: x-axis range 0 - 1088 [ 4.956] (--) synaptics: ETPS/2 Elantech Touchpad: y-axis range 0 - 704 [ 4.956] (--) synaptics: ETPS/2 Elantech Touchpad: pressure range 0 - 255 [ 4.956] (--) synaptics: ETPS/2 Elantech Touchpad: finger width range 0 - 15 [ 4.956] (--) synaptics: ETPS/2 Elantech Touchpad: buttons: left right double triple [ 4.956] (--) synaptics: ETPS/2 Elantech Touchpad: Vendor 0x2 Product 0xe [ 4.956] (--) synaptics: ETPS/2 Elantech Touchpad: touchpad found [ 4.980] () synaptics: ETPS/2 Elantech Touchpad: (accel) MinSpeed is now constant deceleration 2.5 [ 4.980] () synaptics: ETPS/2 Elantech Touchpad: MaxSpeed is now 1.75 [ 4.980] (**) synaptics: ETPS/2 Elantech Touchpad: AccelFactor is now 0.154 [ 4.980] (--) synaptics: ETPS/2 Elantech Touchpad: touchpad found So, if all else fails, reinstall Linux :/

    Read the article

  • Silverlight Cream for June 16, 2011 -- #1108

    - by Dave Campbell
    In this Issue: René Schulte, Rajat Jaiswal(-2-), Peter Kuhn, Colin Eberhardt, Kunal Chowdhury(-2-), Beth Massi, Michael Crump, Daniel Vaughan, Chris Rouw, WindowsPhoneGeek, and Jesse Liberty. Above the Fold: Silverlight: "Cubelicious - Silverlight 5 + Balder + Physics + SLARToolkit Augmented Reality = Triple Win!" René Schulte WP7: "Binding the WP7 ProgressIndicator in XAML" Daniel Vaughan LightSwitch: "Adding Static Images and Text on a LightSwitch Screen" Beth Massi Shoutouts: Laurent Bugnion is Proposing a new RelayCommand snippet for MVVM Light V4... read about it and give him some feedback From SilverlightCream.com: Cubelicious - Silverlight 5 + Balder + Physics + SLARToolkit Augmented Reality = Triple Win! René Schulte has a post up about using the SLARToolkit for Silverlight 5 Beta in conjuncion with Balder and Physics ... dang this is cool, check out the video! PSD TO XAML in few easy steps using Expression Blend I'm not a Photoshop person, but apparently Rajat Jaiswal is, and he's demonstrating using Expression Blend to get your PSD file into XAML Its really great feature Silverlight realtime augment toolkit This is a fun post from Rajat Jaiswal... fun to see someone other than René Schulteposting about René's SLARToolkit :) Getting ready for the Windows Phone 7 Exam 70-599 (Part 2) Peter Kuhn has part 2 of his series up on getting ready for the Windows Phone 7 Exam at SilverlightShow Metro In Motion Part #7 – Panorama Prettiness and Opacity Colin Eberhardt has another Metro in Motion up... this one concentrates on the opacity effect when the user slides from item-to-item in Panorama contents Windows Phone 7 (Mango) Tutorial - 13 - What is Tombstoning? Kunal Chowdhury has a couple of posts up... first up is this one on Tombstoning... and if you're just starting with WP7.1, it got easier Windows Phone 7 Tip: Showing and Hiding onscreen keyboard in Emulator Kunal Chowdhury's latest is a great hint if you haven't found it already... how to show/hide the onscreen keyboard in the emulator Adding Static Images and Text on a LightSwitch Screen Beth Massi's latest post is on showing how to display an image or static text such as a logo in a LightSwitch app Displaying PDF Files in Windows Phone 7 Mango Michael Crump responds to reader's questions about displaying a PDF file in WP7.1 with this post using ComponentOne's Studio for Windows Phone CTP Binding the WP7 ProgressIndicator in XAML Daniel Vaughan has a solution to the problem of having to bind the ProgressIndicator in WP7.1 in code-behind... he wrote a ProgressIndicatorProxy and shares it with us!<>/dd> Storing Files in SQL Server using WCF RIA Services and Silverlight – Part 2 Chris Rouw has Part 2 of his Storing Files in SQL Servier using WCF RIA Services and Silverlight up... this one is on uploading and saving files to the database from Silvelright by the user dropping them onto your app. Using SqlMetal to generate Windows Phone Mango Local Database classes OK I'm not too proud to admit I'd never heard of SQLMetal... if you haven't, or even if you have, this post by WindowsPhoneGeek is a good discussion of using it to generate your WP7.1 database classes. Obtaining Email, Address or Phone Number Jesse Liberty's latest is another in his 'Mango From Scratch' series discussing the new tasks to obtain more info from the contact list. Stay in the 'Light! 

    Read the article

  • Problems with 3D Array for Voxel Data

    - by Sean M.
    I'm trying to implement a voxel engine in C++ using OpenGL, and I've been working on the rendering of the world. In order to render, I have a 3D array of uint16's that hold the id of the block at that point. I also have a 3D array of uint8's that I am using to store the visibility data for that point, where each bit represents whether a face is visible. I have it so the blocks render and all of the proper faces are hidden if needed, but all of the blocks are offset by a power of 2 from where they are stored in the array. So the block at [0][0][0] is rendered at (0, 0, 0), and the block at [1][1][1] is rendered at (1, 1, 1), but the block at [2][2][2] is rendered at (4, 4, 4) and the block at [3][3][3] is rendered at (8, 8, 8), and so on and so forth. This is the result of drawing the above situation: I'm still a little new to the more advanced concepts of C++, like triple pointers, which I'm using for the 3D array, so I think the error is somewhere in there. This is the code for creating the arrays: uint16*** _blockData; //Contains a 3D array of uint16s that are the ids of the blocks in the region uint8*** _visibilityData; //Contains a 3D array of bytes that hold the visibility data for the faces //Allocate memory for the world data _blockData = new uint16**[REGION_DIM]; for (int i = 0; i < REGION_DIM; i++) { _blockData[i] = new uint16*[REGION_DIM]; for (int j = 0; j < REGION_DIM; j++) _blockData[i][j] = new uint16[REGION_DIM]; } //Allocate memory for the visibility _visibilityData = new uint8**[REGION_DIM]; for (int i = 0; i < REGION_DIM; i++) { _visibilityData[i] = new uint8*[REGION_DIM]; for (int j = 0; j < REGION_DIM; j++) _visibilityData[i][j] = new uint8[REGION_DIM]; } Here is the code used to create the block mesh for the region: //Check if the positive x face is visible, this happens for every face //Block::VERT_X_POS is just an array of non-transformed cube verts for one face //These checks are in a triple loop, which goes over every place in the array if (_visibilityData[x][y][z] & 0x01 > 0) { _vertexData->AddData(&(translateVertices(Block::VERT_X_POS, x, y, z)[0]), sizeof(Block::VERT_X_POS)); } //This is a separate method, not in the loop glm::vec3* translateVertices(const glm::vec3 data[], uint16 x, uint16 y, uint16 z) { glm::vec3* copy = new glm::vec3[6]; memcpy(&copy, &data, sizeof(data)); for(int i = 0; i < 6; i++) copy[i] += glm::vec3(x, -y, z); //Make +y go down instead return copy; } I cannot see where the blocks may be getting offset by more than they should be, and certainly not why the offsets are a power of 2. Any help is greatly appreciated. Thanks.
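    A note on storage: a contiguous, index-computed array is usually easier to get right than three levels of new[]. The sketch below is one possible way to hold the block ids and visibility flags in flat std::vectors (the Region name and index() helper are hypothetical; REGION_DIM becomes a constructor argument); it illustrates the data structure only and is not a fix for the specific offset bug described above. Separately, note that in C++ the > operator binds tighter than &, so _visibilityData[x][y][z] & 0x01 > 0 parses as _visibilityData[x][y][z] & (0x01 > 0); writing (_visibilityData[x][y][z] & 0x01) > 0 makes the intended test explicit.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// A minimal sketch, assuming the region dimension is known at construction time.
// Block ids (16-bit) and per-block visibility flags (8-bit) live in flat arrays,
// so a block's location in memory is computed rather than chased through
// three levels of pointers.
struct Region {
    explicit Region(std::size_t dim)
        : dim_(dim),
          blockData_(dim * dim * dim, 0),
          visibilityData_(dim * dim * dim, 0) {}

    // Map (x, y, z) to a single offset into the flat arrays.
    std::size_t index(std::size_t x, std::size_t y, std::size_t z) const {
        return (x * dim_ + y) * dim_ + z;
    }

    std::uint16_t& block(std::size_t x, std::size_t y, std::size_t z) {
        return blockData_[index(x, y, z)];
    }

    std::uint8_t& visibility(std::size_t x, std::size_t y, std::size_t z) {
        return visibilityData_[index(x, y, z)];
    }

private:
    std::size_t dim_;
    std::vector<std::uint16_t> blockData_;
    std::vector<std::uint8_t> visibilityData_;
};
```

    With this layout, accessing a block is just region.block(x, y, z) = id; there is a single allocation per array and no pointer chasing.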

    Read the article

  • Best way to create a SPARQL endpoint for an RDBMS (MySQL database)

    - by Ankur
    I am doing (want to do) some experiments with Linked Open Datasets, particularly those put out by governments. I have an RDBMS (more specifically MySQL). I designed it with semantic web ideas in mind, i.e. I have information stored as objects, predicates and classes which define objects. In turn, all objects are related to each other through statements of the form subject -- predicate -- object (where the subjects are from the objects table). I want to be able to query other RDF triple stores from my application and let other triple stores query my data. Is it possible to "set something up" so that this is possible? I have looked at Jena. Using Jena seems to mean I have to use it as the storage application rather than MySQL - the only problem with this is that I include a new concept called a category (which I don't think is part of the semantic web languages). I will use categories to help with displaying information (they don't have any other meaning), but using Jena seems to mean that I can't organise predicates under categories for more convenient viewing. I am using Java so a Java API is preferred. It's also possible I misunderstood the purpose of Jena, and maybe it can be of use, but I am not sure how. I am sure four days from now this question will seem rather silly, but at the moment I am somewhat confused about how to proceed.

    Read the article

  • Could I do this blind relative to absolute path conversion (for perforce depot paths) better?

    - by wonderfulthunk
    I need to "blindly" (i.e. without access to the filesystem, in this case the source control server) convert some relative paths to absolute paths. So I'm playing with dotdots and indices. For those that are curious I have a log file produced by someone else's tool that sometimes outputs relative paths, and for performance reasons I don't want to access the source control server where the paths are located to check if they're valid and more easily convert them to their absolute path equivalents. I've gone through a number of (probably foolish) iterations trying to get it to work - mostly a few variations of iterating over the array of folders and trying delete_at(index) and delete_at(index-1) but my index kept incrementing while I was deleting elements of the array out from under myself, which didn't work for cases with multiple dotdots. Any tips on improving it in general or specifically the lack of non-consecutive dotdot support would be welcome. Currently this is working with my limited examples, but I think it could be improved. It can't handle non-consecutive '..' directories, and I am probably doing a lot of wasteful (and error-prone) things that I probably don't need to do because I'm a bit of a hack. I've found a lot of examples of converting other types of relative paths using other languages, but none of them seemed to fit my situation. These are my example paths that I need to convert, from: //depot/foo/../bar/single.c //depot/foo/docs/../../other/double.c //depot/foo/usr/bin/../../../else/more/triple.c to: //depot/bar/single.c //depot/other/double.c //depot/else/more/triple.c And my script: begin paths = File.open(ARGV[0]).readlines puts(paths) new_paths = Array.new paths.each { |path| folders = path.split('/') if ( folders.include?('..') ) num_dotdots = 0 first_dotdot = folders.index('..') last_dotdot = folders.rindex('..') folders.each { |item| if ( item == '..' ) num_dotdots += 1 end } if ( first_dotdot and ( num_dotdots > 0 ) ) # this might be redundant? folders.slice!(first_dotdot - num_dotdots..last_dotdot) # dependent on consecutive dotdots only end end folders.map! { |elem| if ( elem !~ /\n/ ) elem = elem + '/' else elem = elem end } new_paths << folders.to_s } puts(new_paths) end

    Read the article

  • How to use boost::fusion::transform on heterogeneous containers?

    - by Kyle
    Boost.org's example given for fusion::transform is as follows: struct triple { typedef int result_type; int operator()(int t) const { return t * 3; }; }; // ... assert(transform(make_vector(1,2,3), triple()) == make_vector(3,6,9)); Yet I'm not "getting it." The vector in their example contains elements all of the same type, but a major point of using fusion is containers of heterogeneous types. What if they had used make_vector(1, 'a', "howdy") instead? int operator()(int t) would need to become template<typename T> T& operator()(T& const t) But how would I write the result_type? template<typename T> typedef T& result_type certainly isn't valid syntax, and it wouldn't make sense even if it was, because it's not tied to the function.
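    For a heterogeneous sequence the function object has to be polymorphic: a templated operator() plus the TR1/Boost result_of protocol (a nested result<> specialization) that tells Fusion the return type for each element type. A minimal sketch along those lines, assuming each element keeps its own type (the triple_any name is made up; t + t + t happens to work for ints, doubles and std::strings alike):

```cpp
#include <cassert>
#include <string>
#include <boost/fusion/include/make_vector.hpp>
#include <boost/fusion/include/transform.hpp>
#include <boost/fusion/include/equal_to.hpp>
#include <boost/type_traits/remove_const.hpp>
#include <boost/type_traits/remove_reference.hpp>

struct triple_any {
    // TR1-style result protocol: lets Fusion compute the return type per element.
    template <typename Sig> struct result;

    template <typename F, typename T>
    struct result<F(T)> {
        // Strip the reference/const that Fusion passes through.
        typedef typename boost::remove_const<
            typename boost::remove_reference<T>::type>::type type;
    };

    template <typename T>
    T operator()(const T& t) const { return t + t + t; }  // "triple" each element
};

int main() {
    namespace fusion = boost::fusion;
    assert(fusion::transform(fusion::make_vector(1, 2.5, std::string("ab")), triple_any())
           == fusion::make_vector(3, 7.5, std::string("ababab")));
}
```

    The result<F(T)> specialization strips the reference and const qualifiers that Fusion passes through, so the functor returns each element by value in its original type.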

    Read the article

  • Transformation of Product Management in Telecommunications for Rapid Launch of Next Generation Products

    - by raul.goycoolea
    The Telecom industry continues to evolve through disruptive products, uncertain markets, shorter product lifecycles and convergence of technologies. Today’s market has moved from network centric to consumer centric and focuses primarily on the customer experience. It has resulted in several product management challenges such as an increased complexity and volume of offerings, creating product variants, accelerating time-to-market, ability to provide multiple product views for varied stakeholders, leveraging OSS intelligence to the BSS layer, product co-creation and increasing audit and security concerns for service providers. The document discusses how enterprise product management enabled by PLM-based product catalogue solutions helps to launch next generation products rapidly in the context of the Telecommunication Industry. 1.0. Introduction Figure 1: Business Scenario Modern business demands the launch of complex products in a very short timeframe and effecting changes in the price plan faster without IT intervention. One of the key transformation initiatives companies are focusing on is in the area of product management transformation and operational efficiency improvement. As part of these initiatives, companies are investing in best-in-class COTS-based Product Management solutions developed on industry-wide standards. The new COTS packages are planned to integrate with existing or new B/OSS systems to provide a strategic end-to-end agile solution for reduced time-to-market and order journey time. In addition, system rationalization is being undertaken to phase out legacy systems and migrate to strategic systems. 2.0. An Overview of Product Management in Telecom Product data in telecom is multi-dimensional and difficult to manage. It has increased significantly due to the complexity of the product, product offerings on the converged network, increased volume of offerings, bundled offering structures and ever increasing regulatory requirements. In addition, the shrinking product lifecycle in telecom makes it difficult to manage the dynamic product data. Mergers and acquisitions coupled with organic growth pose major challenges in product portfolio management. It is a roadblock in the journey towards becoming an agile organization.
Figure 2: Complexity in Product Management 'Network Technology' is the new dimension in telecom product management where the same products are realized through different networks, i.e., from a siloed network to a converged network. Consequently, the product solution is different. Figure 3: Current Scenario - Pain Points in Product Management The major business implications arising out of the current scenario are slow time-to-market and an inefficient process that affects innovation. 3.0. Transformation of Next Generation Product Management Companies must focus on their Product Management Transformation Journey in the areas of: · Management of a single truth of product information across the organization/geographies, which is currently managed in heterogeneous systems · Management of the Intellectual Property (IP) on the product concept and partnership in the design of discrete components to integrate into the system · Leveraging structured and unstructured product data within the extended enterprise to extract consumer insights and drive innovation · Management of effective operational separation to comply with regulatory bodies · Reuse of existing designs and addition of relevant features such as value-added services to enable effective product bundling Figure 4: Next generation needs PLM-based Enterprise Product Catalogue solutions efficiently address the above requirements and act as an enabler towards product management transformation and rapid product launch. 4.0. PLM-based Enterprise Product Management Figure 5: PLM-based Enterprise Product Mastering Enterprise Product Management (EPM) enables the business to manage complex product attributes of data in complex environments. Product Mastering helps create a 'single view' of the product by creating a business-driven, IT-supported environment where a global 'single truth record' is created, managed and reused. 4.1 The Business Case for Telco PLM-based Solutions for Enterprise Product Management · Telco PLM-based Product Mastering solutions provide a centralized authoring environment for product definition and control of all product data and rules · PLM packages are designed to support multiple perspectives of product data (ordering perspective, billing perspective, provisioning perspective) · They maintain relationships/links between different elements of the entire product definition · Telco PLM packages are specialized in next generation lifecycle management requirements of products (such as revision and state management, test and release management, role management and impact analysis) · They take into consideration all aspects of OSS product requirements, compared to CRM product catalogue solutions where the product data managed is mostly order oriented and transactional · The new breed of Telco PLM packages are designed with 'open' standards such as SID and eTOM. They are interoperable and support integration frameworks such as subscription and notification. · Telco PLM packages have developed good collaboration frameworks to integrate suppliers and partners into the product development value chain 4.2 Various Architectures/Approaches for Product Mastering Using Telco PLM Systems 4.2.a Single Central Product Management (Mastering) Approach Figure 6: Single Central Product Management (Master) Approach This approach is implemented across verticals such as aerospace and automotive.
It focuses on a physically centralized product master on which other sources depend. The product definition data (product bundles, service bundles, price plans, offers and discounts, product configuration rules and market campaigns) is created and maintained physically in a centralized environment. In addition, the product definition/authoring environment is centralized. The existing legacy product definition data available in the CRM product catalogue, billing catalogue and the legacy product catalogue is migrated to the centralized PLM-based Enterprise Product Management solution. Architectural changes must be made in the existing business landscape of applications to create and revise data, because the applications have to refer to the central repository for approvals and validation of product configurations. This is achieved by modifying how the applications write data or how the applications can be adapted to use the rules to be managed and published. Complete product configuration validation will be done in the enterprise/central product catalogue and the final configuration will be sent to the B/OSS systems through the SOA-compliant product distribution architecture. The approach/architecture enables greater control in terms of product data management and product data governance. 4.2.b Federated Product Management (Mastering) Architecture Figure 7: Federated Product Management (Mastering) Architecture In the federated product mastering approach, the basic unique product definition data (product id, description, product hierarchy, basic price plans and simple product design rules) will be centrally created and maintained, while the advanced product definition (product bundling, promotions, offers and discount plans) will be created in the respective downstream OSS systems. For example, basic product definitions such as attributes, product hierarchy and basic price plans will be created and maintained in the Enterprise/Central product reference catalogue and distributed to downstream OSS systems. The respective downstream OSS systems build product bundles, promotions and advanced price plans over the basic product definition and master the advanced product definition. The central reference database accesses the respective other source product master data and assembles a point-in-time consolidated view of the product. The approach is typically adopted in some merger and acquisition scenarios where there is a low probability of a central physical authority managing the data. In addition, the migration effort in this case is minimal and there are no big architectural changes to the organization's application landscape. However, this approach will not result in better product data management and data governance. 5.0 Customer Scenario – Before EPC Deployment A leading global telecommunications service provider wanted to launch quad play and triple play service offerings in the shortest possible lead time. The service provider was offering Broadband and VoIP services to customers. The company wanted to reuse a majority of the Broadband services and price plans and bundle them with new wireless and IPTV services for quad play and triple play.
The challenges in launching the new service offerings were:       Figure 8: Triple Play Plan   ·       Broadband product data was stored in multiple product catalogues (CRM catalogue, Billing catalogue, spread sheets)   ·       Product managers spent a lot of time performing tasks involving duplication or re-keying of data. Manual effort caused errors, cost and time over-runs.   ·       No effective product and price data governance mechanism. Price change issues arising from the lack of data consistency across systems resulted in leakage of customer value and revenue.   ·       Product data had re-usability issues and was not in a structured format. It resulted in uncontrolled product portfolio creation and product management issues.   ·       Lack of enterprise product model resulted into product distribution challenges and thus delays in product launch.   ·       Designers are constrained by existing legacy product management solutions to model product/service requirements and product configuration rules such as upgrading, downgrading and cross selling.    5.1 Customer Scenario - After EPC deployment     Figure 9: SOA-based end-to-end EPC Solution   The company deployed PLM-based Enterprise Product Catalogue solutions to launch quad play service after evaluating various product catalogues. The broadband product offering, service and price data were migrated to the new system, and the product and price plan hierarchy for new offerings were created using the entities defined in the Enterprise Product Model. Supplier product catalogue data such as routers and set up boxes were loaded onto the new solution through SOA-based web service. Price plans and configuration rules were built in the new system. The validated final product configurations were extracted from the product catalogue in a SID format and were distributed to the downstream B/OSS systems through exposed SOA-based web services. The transformations required for the B/OSS system were handled using the transformation layer as part of the solution.   6.0 How PLM enabled Product Management Transformation         Figure 10: Product Management Transformation     PLM-based Product Catalogue Solution helped the customer reduce the product launch cycle time by 30% and enable transformation of Product Management for next generation services.   7.0 Conclusion   On the one hand, the telecom industry is undergoing changes due to disruptions, uncertain product markets and increased complexity of products. On the other hand, the ARPU is decreasing year-on-year. Communications Service Providers are embarking on convergence, bundled service offerings, flexibility to cross-sell and up-sell, introduce new value-added services, leverage Web 2.0 concepts and network capabilities. Consequently, large scale IT transformation initiatives to improve their ARPU supporting network and business transformations are a business imperative. Product Management has become a focus area. Companies are investing in best-in- class COTS solutions to reduce time-to-market, ensure rapid service delivery and improve operational efficiency. An efficient PLM-based enterprise product mastering solution plays a key role in achieving zero touch automation and rapid product launch.   References:   1.     Preston G.Smith, Donald G.Reineristsem, Van Nostrand Reinhold “Developing Products in Half the time”.   2.     John G. Innes, "Achieving Successful Product Change", Pitman Publishing.   3.     
D T Pham and R M Setchi (16th Jan, 2001) "Authoring environment for documentation development" University of Wales Cardiff, U.K., Proceedings on Institution of Mechanical Engineers, Vol. 215, Part B.   4.     Oracle Product Hub for Communications:   http://www.oracle.com/us/products/applications/master-data-management/product-hub-082059.html  

    Read the article

  • Oracle’s Web Experience Management

    - by Christie Flanagan
    Today’s guest post on Oracle’s Web Experience Management comes from a member of our WebCenter Evangelist team, Noël Jaffré, a Principal Technologist based in France.Oracle’s Web Experience Management (WEM) solution enables organizations to optimize the online channel for driving marketing and customer experience management success. It empowers business users to manage the web presence and create rich and engaging online experiences for customers and prospects. Oracle's WEM platform provides a framework to simplify the integration of Oracle, third-party and custom-built applications. This framework essentially allows the creation and integration of applications using one single business interface called the WEM interface. It includes the following: Single sign-on access control for all integrated applications using the Central Authentication Service (CAS) component. A single centralized administration window for user, role, and native applications management including site management. Community server management, gadget server management as well as management for partner integrated technologies. A Representational State Transfer (REST) API for accessing WebCenter Sites data. REST services are supported on both Oracle WebCenter Sites and Oracle WebCenter Sites Satellite Server to leverage the satellite server cache. All REST requests are cached for web consuming applications as well for the high performance delivery of native applications on the mobile channel. Oracle WebCenter Sites’ Web Experience Management environment enables organizations to deliver a compelling online experience to customers by simplifying the deployment and management of sophisticated and engaging websites. The WebCenter Sites platform automates the entire process of managing web content including: Authoring:  Business users can easily contribute and manage web content in real-time, with intuitive interfaces and drag-and-drop content authoring and layout capabilities designed for the non-technical user. Contextual Content Targeting: Marketers are empowered to create and manage targeted campaigns with relevant recommendations and promotions based on the context of the session of the visitor such as his or her navigation history, user profile, language, location or other information shared during the visitor session. Content Publishing and Deployment: It offers advanced multi-site management capabilities for departmental or regional sites, as well as strong multi-lingual and multi-locale content management. The remote satellite server caching infrastructure provides high-performance, distributed caching, tuned to deliver high-volume, targeted and multi-lingual sites. Analytics and Optimization: Business users and marketers have the ability to measure the effectiveness of their online content and campaigns at a granular level. Editors and marketers can immediately determine whether a given article or promotion is relevant to a particular customer segment. User-generated Content: Marketers can enable blogs, comments, rating and reviews on the website.  All comments and reviews posted to the website can be moderated from the administrator interface either manually or automatically using filters, whitelists, blacklists or community based moderation. Personalized Gadget Dashboards:  Site managers can deploy gadgets, small applications using web content, individually or as part of dashboards containing multiple gadgets.  
These gadget dashboards enable site visitors to create their own “MyPage” on a given site where they can select and customize the gadgets that the site administrator has made available.  Any gadget that conforms to the iGoogle/OpenSocial standard can be made available to site visitors, or they can be created within the WEM interface. Oracle's WEM platform also provides a unique environment for the delivery of a rich, multichannel online experience for site visitors through its advanced management modules for mobile. With Oracle’s WEM solution, it’s easy to control branding and deliver a consistent message while repurposing web content for publication to mobile devices, kiosks and much more. This distinctive approach provides: HTML5 Delivery: HTML5 delivery which includes native support for adaptive design that responds to the user’s computer screen resolution and orientation. The approach is less driven by the particular hardware and more driven by the user’s interactions with the device. In other words, this approach takes both the screen interactions (either cursor or touch) and screen sizes and orientation into consideration. A Unique Native Mobile Extension Environment for Contributors: From the WEM interface, a contributor can directly manage their mobile channel, using the tooling already in place for driving the traditional web presence. This includes the mobile presentation, as well as mobile insite editing, drag and drop page layout, and in-context recommendations and personalization. Optimized REST APIs for High Performance Content Delivery on Native Mobile Device Applications: WebCenter Sites’ REST API uses the underlying HTTP methods (GET, POST, PUT, DELETE) to interact with resources. Resources support two types of input and output formats -- XML and JSON. REST calls are customizable to optimize the interactions between the content repositories and the client applications. Caching is essential to decrease network loads and improve overall reliability and usability of the applications and user interactions. REST results are cached through the highly efficient Oracle WebCenter Sites caching architecture.

    Read the article

  • The Customer Experience Imperative: A Game Changer for Brands

    - by Jeri Kelley
    By Anthony Lye, SVP, Cloud Applications Strategy, Oracle We know that customer experience has emerged as a primary differentiator for businesses today.  I’ve talked a lot about the new age of the empowered consumer. At Oracle we’ve spent a lot of time developing technologies and practices that our customers can implement to greatly improve their customer experience strategies. Of course I’m biased, but I think that we have created a portfolio of the best solutions on the planet to help organizations deal with the challenges of providing great customer experiences. We’ve done this because we started to witness some trends over the last few years. As the average person began to utilize social and mobile technologies more frequently and products commoditized, customer experience truly remained the only sustainable differentiator for businesses.In fact, we have seen that customer experience is often driving the success or the failure of a product or a brand. And as end customers have become more vocal about their experiences with companies on social and mobile channels, they now have the power to decide which brands will win and which brands will lose. To address this customer experience imperative, I believe that business today must do three things really well:Connect with your customers. You have to connect with customers whenever, wherever and however they want. Organizations must provide a great experience on their existing channels— the call center, the brick and mortar store, the field sales organizations, the websites and social properties. Businesses must also be great at managing and delivering journeys on these channels, while quickly adapting to embrace the new channels that emerge. You have to understand mobile. You have to understand social. You have to understand kiosks. These are all new routes to market, new channels where your customers may or may not show up. You have to interact with them where they are. You have to present information in a way that's meaningful to them. As well as providing what we would call a multichannel experience. We have to recognize that customers may start their experience on one channel, but end it on a different channel. It’s important that an organization’s technology solutions enable, not just a multichannel strategy, but a strategy that can power new channels and create customer journeys that cross these channels.Get to know your customers. Next, companies need to get to know the customer as intimately as the customer will allow. Today most customer interactions are anonymous, but it’s important for brands to know which customers drive value. Customers want to provide feedback. They want to share their opinions, but they want to know that those opinions are being heard and acted upon. For this to occur, we need to know much more about the customer and then reward them for their loyalty and for their advocacy.Enable connections. The last thing is to enable people to connect or transact with your brand. We've got to make it really, really simple for customers to do business with us. We can't make them repeat the steps; we can't make them tell us their identity for the fifth time as they move between organizations. These silos can no longer sustain or deliver a good customer experience. It's extremely important that companies be where customers want them to be—that we create profitable journeys for us and for them.Organizations have to make sure that there is a single source of truth that defines the customer. 
We have to make sure that the technology applications that we rely on understand not just the dimensions of multichannel, but of cross-channel too. We have to enable social at the very core of the overall architecture. We have to use historical analytics, real-time decisioning as well as predictive analytics to help personalize and drive an experience. And these are all technologies that IT needs, that IT is familiar with, but needs to enable for the line of business that in turn can enable for the end customer.  This means that we've got to make our solutions available to the customers in the cloud.In this new age of the empowered consumer, businesses have to focus on delivery mechanisms that reduce the overall TCO, while driving a rapid rate of innovation and a more rapid rate of deployment. At the Oracle Customer Experience Summit @ OpenWorld, I’ll discuss these issues and more. I hope that you can join us for what promises to be an unforgettable experience.

    Read the article

  • Drive Online Engagement with Intuitive Portals and Websites

    - by kellsey.ruppel
    As more and more business is being conducted via online channels, engaging users and making them more productive and efficient though these online channels is becoming critical. These users could be customers, partners or employees and while the respective channels through which they interact might be different, these users do increasingly interact with your business through the Web, or mobile devices or now through various social mediums.  Businesses need a user engagement strategy and solution that allows them to deliver targeted and personalized content and applications to users through the various online mediums and touch points.  The customer experience today is made up of an ongoing set of interactions with organizations across many channels, online and offline.  The Direct channel (including sales reps, email and mail) is an important point of contact, as is the Contact Center.  Contact Centers rely on the phone as a means of interacting with customers, and also more now than ever, the Web as well.  However, the online organization is often managed separately from the Contact Center organization within a business. In-store is an important channel for retailers, offering Point-of-Service for human interactions, and Kiosks which enable self-service. Kiosks are a Web-enabled touch point but in-store kiosks are often managed by the head of retail operations, rather than the online organization.  And of course, the online channel, including customer interactions with an organization via digital means -- on the website, mobile websites, and social networking sites, has risen to paramount importance in recent years in the customer experience. Historically all of these channels have been managed separately. The result of all of this fragmentation is that the customer touch points with an organization are siloed.  Their interactions online are not known and respected in their dealings in-store.  Their calls to the contact center are not taken as input into what the website offers them when they arrive. Think of how many times you’ve fallen victim to this. Your experience with the company call center is different than the experience in-store. Your experience with the company website on your desktop computer is different than your experience on your iPad. I think you get the point. But the customer isn’t the only one we need to look at here, as employees and the IT organization have challenges as well when it comes to online engagement. There are many common tools and technologies that organizations have been using to try and engage users, whether it’s customers, employees or partners. Some have adopted different blog and wiki technologies (some hosted, some open source, sometimes embedded in platforms), to things like tagging, file sharing and content management, or composite applications for self-service applications and activity streams. Basically, there are so many different tools & technologies that each address different aspects of user engagement. Now, one of the challenges with this, is that if we look at each individual tool, typically just implementing for example a file sharing and basic collaboration solution, may meet the needs of the business user for one aspect of user engagement, but it may not be the best solution to engage with customers and partners, or it may not fit with IT standards such as integrating with their single sign on tools or their corporate website. 
    Often, the result is that businesses have to acquire multiple pieces and parts, and build custom applications, to meet their needs, leaving customers and partners with a more fragmented way of interacting with the company. Every organization has some sort of enterprise balancing act between the needs of the business user and the needs and restrictions enforced by enterprise IT groups.

    As we've been discussing, expectations for online engagement have changed since the days of the static, one-size-fits-all website, and with these changes have come some very difficult organizational challenges. Today, as a business user, you want to engage with your customers, and your customers expect you to know who they are. They expect you to recall the details they've provided on your website, to your CSRs and to your sales people. They expect you to remember their purchases, their preferences and their problems. And they expect you to know who they are, equally well, across channels, including your web presence.

    This creates a host of challenges for today's business users. Delivering targeted, relevant content online is now essential for converting prospects into customers and for engendering long-term loyalty. Business users need the ability to leverage customer data from different sources to fuel their segmentation and targeting strategies, and to easily set up, manage and optimize online campaigns. Just as critical, they need to accomplish these things on the fly, at the speed of the marketplace, while making iterative improvements.

    These changing expectations put a host of demands on the IT organization as well. The web presence must be able to scale to deliver personalized and targeted content to thousands of site visitors without sacrificing performance, and integration between systems becomes more important as organizations strive to obtain one view of the customer culled from WCM data, CRM data and more.

    So how do you solve these challenges and meet the growing demands of your users? You need a solution that:

    - Unifies every customer interaction across all channels
    - Personalizes the products and content that interest the customer, tailored to the device
    - Delivers targeted promotions to the right customer
    - Engages employees and improves their productivity
    - Provides self-service access to applications
    - Includes embedded, in-context social capabilities

    So how do you achieve this level of online engagement, deliver a complete customer experience and engage your employees? The answer: Oracle WebCenter. If you want to learn how to get there, we encourage you to attend Thursday's webcast, Drive Online Engagement with Intuitive Portals and Websites, where we'll talk about how you can transform your portal experience and optimize online engagement, making your portals more interactive and more engaging across multiple channels. Register today!

    Read the article

  • RDS installation failure on 2012 R2 Server Core VM in Hyper-V Server

    - by Giles
    I'm currently installing a test-bed for my firm's infrastructure replacement. 10 or so Windows/Linux servers will be replaced by 2 physical servers running Hyper-V Server. All services (DC, RDS, SQL) will run on Windows Server 2012 R2 Server Core VMs, Exchange on Server 2012 R2 with a GUI, and the rest are things like Elastix and MailArchiver, which aren't part of the equation thus far. I have installed Hyper-V Server on a test box and successfully got two virtual DCs running, SQL 2014 running, and a Windows 8.1 VM which I use for the RSAT tools. When trying to install RDS (the old-fashioned session-based kind, not the newer VDI style), I get a failed installation because the server cannot reboot. A couple of articles have said not to do it locally, so I've moved on. Sitting at the PowerShell prompt on the Domain Controller or the SQL server (both Server Core), I run the following commands:

        Import-Module RemoteDesktop
        New-SessionDeployment -ConnectionBroker "AlstersTS.Alsters.local" -SessionHost "AlstersTS.Alsters.local"

    The installation begins, carries on for 2 or 3 minutes, then I receive the following error message:

        New-SessionDeployment : Validation failed for the "RD Connection Broker" parameter.
        AlstersTS.Alsters.local Unable to connect to the server by using Windows PowerShell remoting.
        Verify that you can connect to the server.
        At line:1 char:1
        + New-SessionDeployment -ConnectionBroker "AlstersTS.Alsters.local" -SessionHost " ...
            + CategoryInfo          : NotSpecified: (:) [Write-Error], WriteErrorException
            + FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,New-SessionDeployment

    So far, I have:

    - Triple, triple checked the syntax.
    - Tried various other commands, and a script, to accomplish the same task.
    - Checked that DNS is functioning as it should.
    - Checked, to the best of my knowledge, that AD is working as it should.
    - Checked that the Network Service account has the needed permissions.
    - Created another VM and placed the two roles on different servers.
    - Deleted all the VMs and started again with a new domain name (lather, rinse, repeat).
    - Performed the whole installation on a second physical box running Hyper-V Server.
    - Pleaded with it.

    Interestingly, if I perform the installation via a GUI install, it just works! I know I could convert that to a Server Core installation afterwards, but it wouldn't teach me what was wrong in the first instance. I've gone through probably 10 pages of Google results, each page getting a little less relevant. The closest matches seem to have good information, but none of it seems to be the fix for my set-up. As a side note, I expected to be able to "tee" or "out-file" the error message into a text file, but couldn't get that to work either, so I've typed the error message in manually. Chaps, any suggestions, from the glaringly obvious to the long-winded and complex? Thanks!
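    Since the error reported by New-SessionDeployment is a PowerShell remoting failure rather than an RDS-specific one, a minimal remoting sanity check from the same Server Core prompt can help narrow things down. The snippet below is a sketch using standard WinRM/remoting cmdlets (it is not taken from the original post) and assumes the broker FQDN AlstersTS.Alsters.local used above:

        # Confirm the WinRM service is running locally (it normally is on Server Core)
        Get-Service WinRM

        # Ensure remoting is enabled and the firewall exceptions are in place
        Enable-PSRemoting -Force

        # Verify WS-Man can reach the connection broker by its FQDN
        Test-WSMan -ComputerName "AlstersTS.Alsters.local"

        # Confirm a full remoting round-trip to the broker
        Invoke-Command -ComputerName "AlstersTS.Alsters.local" -ScriptBlock { $env:COMPUTERNAME }

    If Test-WSMan or Invoke-Command fails here, the New-SessionDeployment error is just a symptom of an underlying remoting or name-resolution problem.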

    Read the article

  • DisplayPort -> DVI Adapter, Active or Passive?

    - by Aren B
    I read somewhere that you only need an active (USB-powered) adapter if you want resolutions greater than 1920x1200. I also heard from someone that you need one when running a triple-monitor setup, because driving more than two displays draws too much power. So, to connect a third monitor at 1920x1200 to my video card's DisplayPort output, do I need to spend the extra $100 and buy the active adapter?

    Read the article

  • Unable to connect to MSN network with Adium

    - by Adam Driscoll
    I'm new to the Mac world and have tried both MSN Messenger for Mac (7.0 and the 8.0 beta) and Adium to connect to the Windows Live (MSN) network. I've enabled 'Allow all incoming connections' in the Firewall settings. Windows Live Messenger works fine when connecting through the same router on my Windows laptop. I've triple-checked my password and verified it through a web browser and Windows Live. Any ideas what my issue could be?

    Read the article

  • How do I paste into the "crosh" terminal in a Chromebook?

    - by David Faux
    With Chrome open, I pressed Ctrl + Alt + T to open a crosh terminal on my Chromebook. How do I paste content into this terminal that was copied from another Chrome tab? I have tried Ctrl + V to no avail. I have also tried highlighting text and pasting it with the middle mouse button, which failed too. I have also read this article (http://www.servercobra.com/nothing-but-chromebook-for-a-week/) and tried triple-clicking my touchpad, but that isn't working for me.

    Read the article
