Search Results

Search found 8976 results on 360 pages for 'advanced customer'.

  • Add Artistic Effects to Your Pictures in Office 2010

    - by DigitalGeekery
    Do you ever wish you could add cool effects to images in your Office document pictures, but don’t have access to a graphics editor? Today we take a look at the Artistic Effects feature, which is new in Office 2010. Note: We will show you examples in Excel, but Artistic Effects are available in Word, Excel, and PowerPoint. To insert a picture into your Office document, click the Picture button on the Insert tab. Once you import your picture, the Picture Tools Format ribbon should be active. If not, click on the image. In the Adjust group, click on Artistic Effects. You will see a selection of effect preview images in the dropdown list. Hover your cursor over the effects to use Live Preview to see what your picture will look like if that effect is applied. When you find an effect you like, just click to apply it to the image. There are also some additional Artistic Effect Options. Each effect has its own set of available options that can be adjusted by moving the sliders left or right. If you find you want to undo an effect after it has been applied, simply select the None option from the previews under Artistic Effects. Conclusion: Artistic Effects provides a really easy way to add professional-looking effects to images in Office 2010 without the need to access graphics editing software. Check out some of our other Office 2010 articles like how to use advanced font ligatures, add video from the web to PowerPoint 2010, and preview before you paste in Office 2010.

    Read the article

  • General monitoring for SQL Server Analysis Services using Performance Monitor

    - by Testas
    A recent customer engagement required the setup of a monitoring solution for SSAS. Due to the time restrictions placed upon this, the native Windows Performance Monitor (Perfmon) and SQL Server Profiler monitoring tools were used, as a third-party tool would have meant the customer providing an additional monitoring server that was not available. I wanted to outline the performance monitoring counters that were used to monitor the system on which SSAS was running. Due to the slow query performance that was occurring during certain scenarios, Perfmon was used to establish whether any pressure was being placed on the Disk, CPU or Memory subsystem when concurrent connections access the same query, and Profiler to pinpoint how the query was being managed within SSAS; Profiler I will leave for another blog. This guide is not designed to provide a definitive list of what should be used when monitoring SSAS; different situations may require the addition or removal of counters. However, I hope that it serves as a good basis for starting your monitoring of SSAS. I would also like to acknowledge Chris Webb’s awesome chapters from “Expert Cube Development” that also helped shape my monitoring strategy: http://cwebbbi.spaces.live.com/blog/cns!7B84B0F2C239489A!6657.entry
    Simulating Connections: To simulate the additional connections to the SSAS server whilst monitoring, I used ascmd to simulate multiple connections to the typical and worst-performing queries that were identified by the customer. A similar script can be downloaded from CodePlex at http://www.codeplex.com/SQLSrvAnalysisSrvcs (file name: ASCMD_StressTestingScripts.zip).
    Performance Monitor: Within Performance Monitor, a counter log was created that contained the list of counters below. The important point to note when running the counter log is that the RUN AS property within the counter log properties should be changed to an account that has rights to the SSAS instance when monitoring MSAS counters. Failure to do so means that the counter log runs under the system account; no errors or warnings are given while the counter log runs, and it is not until you need to view the MSAS counters that you discover they were not collected, because the default account has no rights to SSAS. If your connection simulation takes hours, this could prove quite frustrating if not done beforehand. The counters used (Object \ Counter (Instance): Justification):
    - System \ Processor Queue Length (N/A): Indicates how many threads are waiting for execution against the processor. If this counter is consistently higher than around 5 when processor utilization approaches 100%, then this is a good indication that there is more work (active threads) available (ready for execution) than the machine's processors are able to handle.
    - System \ Context Switches/sec (N/A): Measures how frequently the processor has to switch from user- to kernel-mode to handle a request from a thread running in user mode. The heavier the workload running on your machine, the higher this counter will generally be, but over the long term the value of this counter should remain fairly constant. If this counter suddenly starts increasing, however, it may be an indication of a malfunctioning device, especially if the Processor\Interrupts/sec\(_Total) counter on your machine shows a similar unexplained increase.
    - Process \ % Processor Time (sqlservr): Definitely should be used if Processor\% Processor Time\(_Total) is maxing at 100%, to assess the effect of the SQL Server process on the processor.
    - Process \ % Processor Time (msmdsrv): Definitely should be used if Processor\% Processor Time\(_Total) is maxing at 100%, to assess the effect of the SSAS process on the processor.
    - Process \ Working Set (sqlservr): If the Memory\Available bytes counter is decreasing, this counter can be run to indicate if the process is consuming larger and larger amounts of RAM. Process(instance)\Working Set measures the size of the working set for each process, which indicates the number of allocated pages the process can address without generating a page fault.
    - Process \ Working Set (msmdsrv): If the Memory\Available bytes counter is decreasing, this counter can be run to indicate if the process is consuming larger and larger amounts of RAM. Process(instance)\Working Set measures the size of the working set for each process, which indicates the number of allocated pages the process can address without generating a page fault.
    - Processor \ % Processor Time (_Total and individual cores): Measures the total utilization of your processor by all running processes. If multi-proc, be mindful that only an average is provided.
    - Processor \ % Privileged Time (_Total): To see how the OS is handling basic IO requests. If kernel-mode utilization is high, your machine is likely underpowered, as it's too busy handling basic OS housekeeping functions to be able to effectively run other applications.
    - Processor \ % User Time (_Total): To see how the applications are interacting from a processor perspective; a high percentage utilisation indicates that the server is dealing with too many apps and may require increasing the hardware or scaling out.
    - Processor \ Interrupts/sec (_Total): The average rate, in incidents per second, at which the processor received and serviced hardware interrupts. Should be consistent over time, but a sudden unexplained increase could indicate a device malfunction, which can be confirmed using the System\Context Switches/sec counter.
    - Memory \ Pages/sec (N/A): Indicates the rate at which pages are read from or written to disk to resolve hard page faults. This counter is a primary indicator of the kinds of faults that cause system-wide delays, and it is the primary counter to watch for indication of possible insufficient RAM to meet your server's needs. A good idea here is to configure a Perfmon alert that triggers when the number of pages per second exceeds 50 per paging disk on your system. You may also want to review the configuration of the page file on the server.
    - Memory \ Available MBytes (N/A): The amount of physical memory, in megabytes, available to processes running on the computer. If this counter is greater than 10% of the actual RAM in your machine then you probably have more than enough RAM. Monitor it regularly to see if any downward trend develops, and set an alert to trigger if it drops below 2% of the installed RAM.
    - Physical Disk \ Disk Transfers/sec (each physical disk): If it goes above 10 disk I/Os per second then you've got poor response time for your disk.
    - Physical Disk \ % Idle Time (_Total): If Disk Transfers/sec is above 25 disk I/Os per second, use this counter, which measures the percentage of time that your hard disk is idle during the measurement interval. If you see this counter fall below 20% then you've likely got read/write requests queuing up for your disk, which is unable to service these requests in a timely fashion.
    - Physical Disk \ Disk Queue Length (the OLAP and SQL physical disks): A value that is consistently less than 2 means that the disk system is handling the IO requests against the physical disk.
    - Network Interface \ Bytes Total/sec (the NIC): Should be monitored over a period of time to see if there is an increase or decrease in network utilisation.
    - Network Interface \ Current Bandwidth (the NIC): An estimate of the current bandwidth of the network interface in bits per second (bps).
    - MSAS 2005: Memory \ Memory Limit High KB (N/A): Shows (as a percentage) the high memory limit configured for SSAS in C:\Program Files\Microsoft SQL Server\MSAS10.MSSQLSERVER\OLAP\Config\msmdsrv.ini.
    - MSAS 2005: Memory \ Memory Limit Low KB (N/A): Shows (as a percentage) the low memory limit configured for SSAS in C:\Program Files\Microsoft SQL Server\MSAS10.MSSQLSERVER\OLAP\Config\msmdsrv.ini.
    - MSAS 2005: Memory \ Memory Usage KB (N/A): Displays the memory usage of the server process.
    - MSAS 2005: Memory \ File Store KB (N/A): Displays the amount of memory that is reserved for the cache. Note that if the total memory limit in msmdsrv.ini is set to 0, no memory is reserved for the cache.
    - MSAS 2005: Storage Engine Query \ Queries from Cache Direct/sec (N/A): Displays the rate of queries answered from the cache directly.
    - MSAS 2005: Storage Engine Query \ Queries from Cache Filtered/sec (N/A): Displays the rate of queries answered by filtering an existing cache entry.
    - MSAS 2005: Storage Engine Query \ Queries from File/sec (N/A): Displays the rate of queries answered from files.
    - MSAS 2005: Storage Engine Query \ Average time/query (N/A): Displays the average time of a query.
    - MSAS 2005: Connection \ Current connections (N/A): Displays the number of connections against the SSAS instance.
    - MSAS 2005: Connection \ Requests/sec (N/A): Displays the rate of query requests per second.
    - MSAS 2005: Locks \ Current Lock Waits (N/A): Displays the number of connections waiting on a lock.
    - MSAS 2005: Threads \ Query Pool Job Queue Length (N/A): The number of queries in the job queue.
    - MSAS 2005: Proc Aggregations \ Temp file bytes written/sec (N/A): Shows the number of bytes of data processed in a temporary file.
    - MSAS 2005: Proc Aggregations \ Temp file rows written/sec (N/A): Shows the number of rows of data processed in a temporary file.
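    If you prefer to script the collection rather than build the counter log by hand, the same counters can be sampled programmatically. Below is a minimal sketch (not from the original article) that assumes the pywin32 package, which exposes the Windows PDH API as the win32pdh module; it samples a few of the OS-level counters listed above. As noted above, run it under an account with rights to the SSAS instance if you add any MSAS counters to the list.

        # Minimal counter-sampling sketch (assumes pywin32 / win32pdh is installed).
        # Counter paths below are standard OS counters from the list above.
        import time
        import win32pdh

        COUNTERS = [
            r"\System\Processor Queue Length",
            r"\Processor(_Total)\% Processor Time",
            r"\Process(msmdsrv)\Working Set",
            r"\Memory\Available MBytes",
        ]

        query = win32pdh.OpenQuery()
        handles = {path: win32pdh.AddCounter(query, path) for path in COUNTERS}

        win32pdh.CollectQueryData(query)            # first collection primes rate-based counters
        for _ in range(12):                         # roughly one minute at a 5-second interval
            time.sleep(5)
            win32pdh.CollectQueryData(query)
            for path, handle in handles.items():
                _, value = win32pdh.GetFormattedCounterValue(handle, win32pdh.PDH_FMT_DOUBLE)
                print(f"{path}: {value:.2f}")

        win32pdh.CloseQuery(query)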

    Read the article

  • 11gR2 DB 11.2.0.1 Certified with E-Business Suite on Solaris 10 (x86-64)

    - by Steven Chan
    Oracle Database 11g Release 2 version 11.2.0.1 is now certified with Oracle E-Business Suite 11i (11.5.10.2) and Release 12 (12.0.4 or higher, 12.1.1 or higher) on Oracle Solaris on x86-64 (64-bit) running Solaris 10. This announcement includes:
    - Oracle Database 11gR2 version 11.2.0.1
    - Oracle Database 11gR2 version 11.2.0.1 Real Application Clusters (RAC)
    - Transparent Data Encryption (TDE) Column Encryption with EBS 11i and R12
    - Advanced Security Option (ASO)/Advanced Networking Option (ANO)
    - Export/Import Process for E-Business Suite 11i and R12 Database Instances
    - Transparent Data Encryption (TDE) Tablespace Encryption

    Read the article

  • Book Review: The Art of XSD - SQL Server XML schemas

    The 14 chapters of "The Art of XSD", written by MVP Jacob Sebastian, take the reader step by step from the basics of XML Schema design all the way to advanced topics on SQL Server XML Schema Collections. Reviewer Hima Bindu Vejella gives it an 8/10 rating, and gives us an excellent distilled description of what the book has to offer.

    Read the article

  • How to set up Dual Head with "radeon" driver for R770?

    - by user1709408
    I want to make a dual head setup without xrandr but with Xinerama. I put a "Screen 1" line into xorg.conf, but the card still shows identical output on DVI-2 and DVI-3. It is important for me to use Xinerama (to glue three monitors together), which is why I decided not to use RandR (RandR is incompatible with Xinerama, as I read somewhere). Here is my video card (HD 4850 X2): lspci | grep R700 03:00.0 VGA compatible controller: Advanced Micro Devices [AMD] nee ATI R700 [Radeon HD 4850] 04:00.0 Display controller: Advanced Micro Devices [AMD] nee ATI R700 [Radeon HD 4850] Here is how the monitors are connected: grep "DVI" /var/log/Xorg.0.log [ 1210.002] (II) RADEON(0): Output DVI-0 using monitor section Monitor0 [ 1210.048] (II) RADEON(0): Output DVI-1 has no monitor section [ 1210.079] (II) RADEON(0): EDID for output DVI-0 [ 1210.080] (II) RADEON(0): Printing probed modes for output DVI-0 [ 1210.128] (II) RADEON(0): EDID for output DVI-1 [ 1210.128] (II) RADEON(0): Output DVI-0 connected [ 1210.128] (II) RADEON(0): Output DVI-1 disconnected [ 1210.128] (II) RADEON(0): Output DVI-0 using initial mode 1920x1200 [ 1210.160] (II) RADEON(1): Output DVI-2 using monitor section Monitor2 [ 1210.215] (II) RADEON(1): Output DVI-3 has no monitor section [ 1210.246] (II) RADEON(1): EDID for output DVI-2 [ 1210.247] (II) RADEON(1): Printing probed modes for output DVI-2 [ 1210.299] (II) RADEON(1): EDID for output DVI-3 [ 1210.300] (II) RADEON(1): Printing probed modes for output DVI-3 [ 1210.300] (II) RADEON(1): Output DVI-2 connected [ 1210.300] (II) RADEON(1): Output DVI-3 connected [ 1210.300] (II) RADEON(1): Output DVI-2 using initial mode 1920x1200 [ 1210.300] (II) RADEON(1): Output DVI-3 using initial mode 1920x1200 Here is my /etc/X11/xorg.conf: Section "ServerFlags" Option "RandR" "0" Option "Xinerama" "1" EndSection Section "ServerLayout" Identifier "Three Head Layout" Screen "MyPrecious0" Screen "MyPrecious2" RightOf "MyPrecious0" Screen "MyPrecious3" LeftOf "MyPrecious0" EndSection Section "Screen" Identifier "MyPrecious0" Monitor "Monitor0" Device "Device300" EndSection Section "Screen" Identifier "MyPrecious2" Monitor "Monitor2" Device "Device400" EndSection Section "Screen" Identifier "MyPrecious3" Monitor "Monitor3" Device "Device401" EndSection Section "Device" Identifier "Device300" BusID "PCI:3:0:0" Screen 0 Driver "radeon" EndSection Section "Device" Identifier "Device400" BusID "PCI:4:0:0" Screen 0 Driver "radeon" EndSection Section "Device" Identifier "Device401" BusID "PCI:4:0:0" Screen 1 Driver "radeon" EndSection Section "Monitor" Identifier "Monitor0" EndSection Section "Monitor" Identifier "Monitor2" EndSection Section "Monitor" Identifier "Monitor3" EndSection I tried to switch to the vesa driver (it didn't work for me). I tried to add options like Option "ZaphodHeads" "DVI-2" and Option "ZaphodHeads" "DVI-3" to sections "Device 400" and "Device 401" (this didn't help because the "ZaphodHeads" option is for RandR, and RandR is disabled by my decision). I tried to merge sections "Device 400" and "Device 401" into one section and add Option "ZaphodHeads" "DVI-2,DVI-3" (see the comment about RandR above). The single-section setup changes the log line RADEON(1): Output DVI-3 has no monitor section into RADEON(1): Output DVI-3 using monitor section Monitor3, but nothing was enough to switch from screen cloning to separate screens.
    This problem (lack of documentation on the radeon driver) is similar to these: Radeon display driver clones monitors while using Xinerama (the moderators' decision to close that question was wrong) and Ubuntu 12.10 multi-monitor setup isn't working. The problem is solvable, because this hardware worked as three-headed for me earlier with gentoo/xorg-server-1.3. Xorg -configure creates a setup for the first monitor on the first GPU. Please don't advise me to use fglrx/aticonfig/amdcccle (this goes against my religious beliefs).

    Read the article

  • How to configure VPN in Windows XP

    - by SAMIR BHOGAYTA
    VPN Overview A VPN is a private network created over a public one. It’s done with encryption; this way, your data is encapsulated and secure in transit – this creates the ‘virtual’ tunnel. A VPN is a method of connecting to a private network by a public network like the Internet. An Internet connection in a company is common. An Internet connection in a home is common too. With both of these, you could create an encrypted tunnel between them and pass traffic, safely and securely. If you want to create a VPN connection you will have to use encryption to make sure that others cannot intercept the data in transit while traversing the Internet. Windows XP provides a certain level of security by using Point-to-Point Tunneling Protocol (PPTP) or Layer Two Tunneling Protocol (L2TP). They are both considered tunneling protocols – simply because they create that virtual tunnel just discussed, by applying encryption. Configure a VPN with XP If you want to configure a VPN connection from a Windows XP client computer you only need what comes with the Operating System itself; it's all built right in. To set up a connection to a VPN, do the following: 1. On the computer that is running Windows XP, confirm that the connection to the Internet is correctly configured. • You can try to browse the Internet • Ping a known host on the Internet, like yahoo.com, something that isn’t blocking ICMP 2. Click Start, and then click Control Panel. 3. In Control Panel, double click Network Connections. 4. Click Create a new connection in the Network Tasks task pad. 5. In the Network Connection Wizard, click Next. 6. Click Connect to the network at my workplace, and then click Next. 7. Click Virtual Private Network connection, and then click Next. 8. If you are prompted, you need to select whether you will use a dialup connection or if you have a dedicated connection to the Internet either via Cable, DSL, T1, Satellite, etc. Click Next. 9. Type a host name, IP or any other description you would like to appear in the Network Connections area. You can change this later if you want. Click Next. 10. Type the host name or the Internet Protocol (IP) address of the computer that you want to connect to, and then click Next. 11. You may be asked if you want to use a Smart Card or not. 12. You are just about done; the rest of the screens just verify your connection, click Next. 13. Click to select the Add a shortcut to this connection to my desktop check box if you want one; if not, then leave it unchecked and click Finish. 14. You are now done making your connection, but by default, it may try to connect. You can either try the connection now if you know it’s valid; if not, then just close it down for now. 15. In the Network Connections window, right-click the new connection and select Properties. Let’s take a look at how you can customize this connection before it’s used. 16. The first tab you will see is the General tab. This only covers the name of the connection, which you can also rename from the Network Connections dialog box by right-clicking the connection and selecting to rename it. You can also configure a First connect, which means that Windows can connect the public network (like the Internet) before starting to attempt the ‘VPN’ connection. This is a perfect example as to when you would have configured the dialup connection; this would have been the first thing that you would have to do. It's simple: you have to be connected to the Internet first before you can encrypt and send data over it.
    This setting makes sure that this is a reality for you. 17. The next tab is the Options tab, which has a lot you can configure in it. For one, you have the option to connect to a Windows domain; if you select this check box (unchecked by default), then your VPN client will request Windows logon domain information while setting up the VPN connection. Also, you have options here for redialing. Redial attempts are configured here if you are using a dial-up connection to get to the Internet. It is very handy to redial if the line is dropped, as dropped lines are very common. 18. The next tab is the Security tab. This is where you would configure basic security for the VPN client. This is where you would set any advanced IPSec configurations and other security protocols, as well as requiring encryption and credentials. 19. The next tab is the Networking tab. This is where you can select what networking items are used by this VPN connection. 20. The last tab is the Advanced tab. This is where you can configure options for configuring a firewall, and/or sharing. Connecting to Corporate Now that you have your XP VPN client all set up and ready, the next step is to attempt a connection to the Remote Access or VPN server set up at the corporate office. To use the connection follow these simple steps. To open the client again, go back to the Network Connections dialog box. 1. Once you are in the Network Connections dialog box, double-click, or right-click and select ‘Connect’ from the menu – this will initiate the connection to the corporate office. 2. Type your user name and password, and then click Connect. Properties brings you back to what we just discussed in this article, all the global settings for the VPN client you are using. 3. To disconnect from a VPN connection, right-click the icon for the connection, and then click “Disconnect”. Summary In this article we covered the basics of building a VPN connection using Windows XP. This is very handy when you have a VPN device but don’t have the ‘client’ that may come with it. If the VPN server doesn’t use highly proprietary protocols, then you can use the XP client to connect with. In a future article I will get into the nuts and bolts of both IPSec and more detail on how to configure the advanced options in the Security tab of this client. Common errors: 678: The remote computer did not respond. 930: The authentication server did not respond to authentication requests in a timely fashion. 800: Unable to establish the VPN connection. 623: The system could not find the phone book entry for this connection. 720: A connection to the remote computer could not be established. More on: http://www.windowsecurity.com/articles/Configure-VPN-Connection-Windows-XP.html
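    Once the connection entry exists, it can also be dialed from a script instead of the Network Connections window. The sketch below is not from the article; it assumes Windows' built-in rasdial.exe (present on XP and later), and the entry name and credentials are placeholders. A non-zero result generally corresponds to one of the RAS error codes listed above (678, 800, and so on).

        # Minimal sketch: dial / hang up an existing VPN phonebook entry via rasdial.exe.
        # "Corporate VPN" is a placeholder for the name chosen in step 9 above.
        import subprocess

        ENTRY = "Corporate VPN"

        def connect(user, password):
            # rasdial <entry> <user> <password> dials the named entry
            return subprocess.run(["rasdial", ENTRY, user, password]).returncode

        def disconnect():
            return subprocess.run(["rasdial", ENTRY, "/disconnect"]).returncode

        if __name__ == "__main__":
            rc = connect("jdoe", "s3cret")
            print("connected" if rc == 0 else f"rasdial reported error code {rc}")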

    Read the article

  • Who Do You Turn To for Your Consumer Goods Sales and Marketing Needs

    - by ruth.donohue
    As a sales or marketing executive, you want the best software for managing your marketing, demand generation, trade promotion, customer/volume planning, and retail execution/monitoring activities and analysis. However, working with niche software vendors can result in a very disjointed user and support experience. It would be ideal to have just one end-to-end solution that could manage and optimize each of these processes...but is that just wishful thinking? Read this Gartner article to find out more!

    Read the article

  • Spotlight on a career path: Paul, Business Development Consultant

    - by Maria Sandu
    I came to work for Oracle in November 2012 as a Customer Intelligence Representative and since then I was promoted to a Business Development Consultant, for Commercial Industries in the UK, based in Dublin. My background was primarily in Logistics, working for such companies as Indaver Ireland, Wincanton and P&O. I spent 10 years working in this industry and gained experience in negotiating with customers and suppliers in order to meet the needs of both, monitoring the quality and quantity of goods as well as the efficiency and organisation of the movement and storage of products. I decided to move from my logistics career in 2009 to study Information Technology in D.I.T. This was a challenge for me to move my career path; however the lectures at the college helped me significantly with the ability to understand how IT can have an effect on how businesses operate. Following on from college I came to work for Oracle. This also presented challenges but the training I received and the encouragement from management helped me understand that the same business rules apply no matter what background you come from. I have also learnt that using my past experience in working with customers and suppliers in Logistics has helped me understand how to meet customer’s needs. Oracle has offered me excellent training such as Sandler Sales Techniques and John Costigan. I continue to get all the training that I need to develop my career. If you’re interested in joining the Business Development Group visit http://bit.ly/oracledirectcareers or follow our CareersatOracle Facebook Community!

    Read the article

  • BAM design pointers

    - by Kavitha Srinivasan
    In working recently with a large Oracle customer on SOA and BAM, I discovered that some BAM best practices are not quite as well known as I had always assumed! There is a doc bug out to formally incorporate those learnings, but here are a few notes. EMS-DO parity: When using EMS (Enterprise Message Source) as a BAM feed, the best practice is to use one EMS to write to one Data Object. There is a possibility of collisions and duplicates when multiple EMS write to the same row of a DO at the same time. This customer had 17 EMS writing to one DO at the same time. Every sensor in their BPEL process writes to one topic, but the topic was read by 1 EMS corresponding to one sensor. They then used XSL within BAM to transform the payload into the BAM DO format. And hence for a given BPEL instance, 17 sensors fired, populated 1 JMS topic, which was consumed by 17 EMS which in turn wrote to 1 Data Object. (You can imagine what would happen for later versions of the application that need to send more information to BAM!) We modified their design to use one master XSL based on sensorname for all sensors relating to a DO – say Data Object 'Orders' – and were thus able to reduce the 17 EMS to 1 with a master XSL. For those of you wondering about how squeaky clean this design is, you are right! This is indeed not squeaky clean, and that brings us to yet another 'inferred' best practice. (I try very hard not to state the obvious in my blogs with the hope that every time I blog, it is very useful, but this one is an exception.) Transformations and Calculations: It is optimal to do transformations within an engine like BPEL. Not only does this provide modelling ease with a nice GUI XSL mapper in JDeveloper, the XSL engine in BPEL is quite efficient at runtime as well. And so, doing XSL transformations in BAM is not quite prudent. The same is true for any non-trivial calculations as well. It is best to do all transformations and calculations and sanitize the data in a BPEL or similar layer and then send this to BAM (via JMS, WS etc.). This then delegates simply the function of report rendering and the mechanics of real-time reporting to the Oracle BAM reporting tool, which it is most suited to do. All nulls are not created equal: Here is yet another possibly known fact, but reiterated here. For an EMS with an Upsert operation: a) If empty tags or tags with no value are sent, like <Tag1/> or <Tag1></Tag1>, the DO will be overwritten with --null-- b) If empty tags are suppressed, i.e. not generated at all, the corresponding DO field will NOT be overwritten. The field will have whatever value existed previously. For an EMS with an Insert operation, both tags with an empty value and no tags result in --null-- being written to the DO. Hope this helps. Happy 4th!
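    To make the upsert behaviour above concrete, here is a minimal sketch (element names are hypothetical, not from the customer's payload) showing the difference between sending an empty tag and suppressing the tag entirely when building an EMS message:

        # Sketch of the two payload shapes discussed above. With an Upsert EMS,
        # the empty <Amount/> overwrites the DO field with --null--, while the
        # payload that suppresses <Amount> leaves the existing DO value alone.
        import xml.etree.ElementTree as ET

        def order_payload(order_id, amount=None, send_empty_amount=True):
            root = ET.Element("Order")
            ET.SubElement(root, "OrderId").text = str(order_id)
            if amount is not None:
                ET.SubElement(root, "Amount").text = str(amount)
            elif send_empty_amount:
                ET.SubElement(root, "Amount")   # serializes as <Amount /> -> --null-- on upsert
            # else: tag suppressed entirely    -> previous DO value preserved on upsert
            return ET.tostring(root, encoding="unicode")

        print(order_payload(1001))                           # <Order><OrderId>1001</OrderId><Amount /></Order>
        print(order_payload(1001, send_empty_amount=False))  # <Order><OrderId>1001</OrderId></Order>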

    Read the article

  • Out-of-the-Box Spatial Dashboards Improve Utility Outage Decisions

    - by stephen.garth
    Oracle Utilities Advanced Spatial Outage Analytics leverages the capabilities of Oracle Business Intelligence with map visualization and geospatial analysis of outage data from utility network management systems, providing BI dashboards to support utility executives and other decision makers throughout the enterprise. This excellent article by Oracle's Guerry Waters, published by Directions Media, gives details. Read the article here. Get more information: - Oracle Spatial - Oracle Utilities - Oracle Business Intelligence

    Read the article

  • Oracle Announces Availability of Oracle Exaskeleton with Extreme Scale

    - by J Swaroop
    Re-posting Bruce Tierney's original post - albeit a day late: I reckon this is Oracle's most interesting launch this year. Enjoy! The World’s First Human Scale Body Surface (HSBS) Designed to Toughen Spineless Wimps April 1, 2012 Building on the success of Oracle Exalogic, Oracle Exadata, and Oracle Exalytics, Oracle today announced the general availability of Oracle Exaskeleton, toughening up spineless wimps across the globe through the introduction of extreme scalability over the human body leveraging a revolutionary new technology called Human Scale Body Surface (HSBS). First Customer Ship (FCS) was received by the little-known and mostly unsuccessful superhero Awkwardman. After applying Oracle Exaskeleton with extreme scale, he has since rebranded himself as Aquaman. Said Aquaman, “I used to feel so helpless in my skin…now I feel like…well…a highly scaled Engineered System thanks to Oracle!” Thousands of meek and mild individuals eagerly lined up outside Oracle Corporation’s Redwood Shores office to purchase the new Oracle Exaskeleton, with the hope of finally gaining the spine they never had. Unfortunately for the individuals, a bully was spotted allegedly kicking the sand covering the beaches of Redwood Shores into the still spineless Exaskeleton hopefuls. Supporting Quotes “Industry analysts are inquiring if Oracle Exaskeleton is a radical departure from Oracle’s traditional enterprise focus into new markets”, said Oracle representative Sabrina Twich, “Oracle has extensive expertise in unified backbone solutions for application infrastructures…this is simply a new port to the human body combining our Business Intelligence (BI) and RDBC (Remote Direct Brain Cell) technologies.” “With this release of Oracle Exaskeleton, Oracle has redefined scalability. Software and hardware vendors had it all wrong” said the Director of Oracle Exaskeleton, “Scalability for hardware is like…um…you know…so scale-ful. No, wait…can I say that again? I didn’t get that right…Scalability is hardware-on-demand with public and private…hybrid clouds, no…<long pause>…Scalability for… nevermind, I don’t want to be in this stupid press release anyway” Releases An upcoming Oracle Exaskeleton service pack release will include a new datasheet with an extensive library of three-letter acronyms (TLAs) as well as the introduction of more four-letter acronyms (FLAs) since technology vendors have used up almost all of the 17,576 TLA permutations (TLAPs). About Oracle Oracle engineers hardware and software to work together in the cloud and in your data center. It would be an amazing coincidence if any of this is true in some secret Oracle lab, but I doubt it. Trademarks Really…you’re still reading this? Cool! Aquaman - First Customer Ship (FCS) - Oracle Exaskeleton

    Read the article

  • Engagement: Don’t Forget Your Employees!

    - by Kellsey Ruppel
    By Mark Brown, Sr. Director, Oracle WebCenter  This week we want to focus on Employee Engagement, and how it is critical to your business. Today we hear and read a great deal about “Customer Engagement” – and rightly so: it is those customers, whether they be traditional paying customers, citizens, students, club members, or whoever it is, that are “paying the bills”.  A more engaged customer is more likely to make it easier to pay those bills by buying more, giving good reviews, or spreading the word of how wonderful their experience was. But what about those who are providing those services, those who design and make those goods; why is it that all too often they are left out of conversations concerning engagement? In fact, it is critical that we consider our employees as customers, since they are using internal systems that run your organization the same way customers use external systems. Studies have shown that an organization in which the employees feel “engaged”, or better able to make decisions, do their jobs, and connect with their peers, has a better return to its stakeholders (shareholders).  On the surface this seems obvious: happy employees are more productive employees. But it leads to the question – how many of our existing policies, systems and processes are actually reducing that level of engagement? Let’s look at a couple of examples. If posting new information that may be of great value to everyone in the larger organization is hard to do because we use an antiquated system, then we’re making it hard to share and increasing the potential for duplicate work. If it is not trivially obvious how to create and publish this post, then chances are very high that I’ll put it on the bottom of my queue. And finally, when critical information is spread across various systems, intranet sites, workgroups and people’s inboxes, then it is very hard to learn and grow from that information.  These may sound trivial, but how often do we push things off, not because they are intellectually challenging (we may have the answer at our fingertips), but because it is hard to make that information readily available.  If an engaged employee is a productive employee, then what can we do to increase their level of engagement? We can start by looking for opportunities to provide self-documenting self-service solutions. Our newer employees grew up using simplified web interfaces every day and they loathe calling a help desk unless it is the last resort. Sadly, many of our enterprise applications have not kept pace and we all still have processes that are based on sending an email -- like discount approvals, vacation requests, or even offer-letter approvals.   My suggestion is to pick one highly visible, high-impact process where employees are either reluctant to execute on the process or openly complain about how cumbersome it is, and look at the mechanism for that process. If there are better ways, streamlined steps, better UIs that could be done, then you have a candidate to reconfigure that process and make it more engaging. Looking to better engage your employees? Start here!

    Read the article

  • Daily tech links for .net and related technologies - May 10-12, 2010

    - by SanjeevAgarwal
    Daily tech links for .net and related technologies - May 10-12, 2010 Web Development jQuery Templates and Data Linking (and Microsoft contributing to jQuery) - ScottGu ASP.NET MVC and jQuery Part 4 – Advanced Model Binding - Mister James Creating an ASP.NET report using Visual Studio 2010 - Part 1 & Part 2 & Part 3 - rajbk Caching Images in ASP.NET MVC -Evan How to Localize an ASP.NET MVC Application - mikeceranski Localization in ASP.NET MVC 2 using ModelMetadata - Raj Kiamal Web Design...(read more)

    Read the article

  • Should you buy an ATI Radeon x1200 driver?

    If you are looking for a good graphics driver, the choices available to you will boggle your mind. Advanced Micro Devices (AMD) has joined up with ATI Technologies to make the most cutting edge graph... [Author: Sunny Makkar - Computers and Internet - March 20, 2010]

    Read the article

  • Are You In The Know About Knowledge?

    - by [email protected]
    "Knowledge is of two kinds. We know a subject ourselves, or we know where we can find information on it." To me, this simple and elegant quote from the great English author Samuel Johnson is a reflection of Oracle's knowledge base strategy. The knowledge base in the My Oracle Support portal (https://support.oracle.com) hosts nearly a half million documents, including how-to instructions, problem-solution descriptions, code samples, FAQs, critical alerts, technical whitepapers, and so on. AutoVue's footprint in the Oracle knowledge base - although relatively small at just around 400 documents - is a steadily-expanding assortment of valuable info. This information is designed to complement what you have already learned from the AutoVue documentation, or in some cases, to examine topics not yet covered in the documentation. Similar to the documentation, the knowledge base is one of the highest-value self-service avenues, since it delivers answers in real-time and is driven by the topics most relevant to customers. There are many different ways to leverage the AutoVue knowledge content, or what Oracle often refers to as "KM Notes": 1. Knowledge Browser: To browse the knowledge hierarchy, click on the 'Knowledge' tab at the top of the My Oracle Support webpage. In the list of product areas at the left, click on 'More Applications', then on 'Oracle AutoVue'. From here, you can either view the full set of KM Notes under the AutoVue product family (AutoVue, VueLink, Web Services, Document Print Services, etc) by clicking on 'All of Oracle AutoVue', or you can drill down further by clicking on 'Enterprise Visualization'. 2. Search: To execute simple keyword searches, use the Search bar at the top-right of the My Oracle Support webpage: 3. Advanced Search: Beside the same Search bar at the top-right of the My Oracle Support webpage, click on the 'Advanced' link in order to increase your control over the search string as well as the product to search against: 4. In your Dashboard: By clicking on the 'Customize' link at the top-right of the Dashboard page in My Oracle Support, you can drag & drop multiple "Knowledge Articles" widgets onto your dashboard. Then, click on the pencil icon at the top-right of the widget to customize it by product. This allows you to keep an active monitor on the most recently updated KM Notes across any product: 5. During SR Creation: As you submit a new Service Request, after entering the product information, SR title, and SR description, you will be presented with a frame at the left containing KM Note suggestions based on the information entered: Let Oracle know what you think! If you like or dislike an article, or would like to comment on how easy/difficult it was to find the article, click on the "Rate this document" link at the bottom of the KM Note. Similarly, during SR creation if one of the suggested KM Notes resolves your question/issue, you can click the "This article solved my problem" link at the bottom of the page. I hope these approaches improve your ability find knowledge content within the My Oracle Support portal, and I encourage you to continue to build your knowledge to further your success with the AutoVue product family.

    Read the article

  • Hadoop growing pains

    - by Piotr Rodak
    This post is not going to be about SQL Server. I have recently been reading more and more about “Big Data” – a very catchy term that describes the untamed increase of the data that mankind is producing each day and the struggle to capture the meaning of these data. Ten years ago, and perhaps even three years ago, this need was not so widely recognized. The increasing number of smartphones, and the discernible trend of mainstream Internet traffic moving to smartphone-generated traffic, mean that there is a bigger and bigger stream of information that has to be stored, transformed, analysed and perhaps monetized. The nature of this traffic makes it very difficult to wrap it into the boundaries of relational database engines. The amount of data makes it near impossible to process it in relational databases within a reasonable time. This is where ‘cloud’ technologies come into play. I just read a good article about the growing pains of Hadoop, which became one of the leading players in the distributed processing arena within the last year or two. Toby Baer concludes in it that a lack of enterprise-ready toolsets hinders Hadoop’s adoption in the enterprise world. While this is true, something else drew my attention. According to the article there are already about half a dozen commercially supported distributions of Hadoop. For me, who has not been involved in the intricacies of the open-source world, this is quite an interesting observation. On one hand, it is good that there is competition, as it is beneficial in the end to the customer. On the other hand, the customer is faced with the difficulty of choosing the right distribution. In future, when Hadoop distributions fork even more, this choice will be even harder. The distributions will have overlapping sets of features, yet will be quite incompatible with each other. I suppose it will take a few years until leaders emerge and the market begins to resemble what we see in the Linux world. There are myriads of distributions, but only a few are acknowledged by the industry as enterprise standard. Others are honed by bearded individuals with too much time to spend. In any case, the third fact I can’t help but notice about the proliferation of distributions of Hadoop is that IT professionals will have jobs. BuzzNet Tags: Hadoop, Big Data, Enterprise IT

    Read the article

  • Genesis WordPress Theme Framework

    - by Edward
    Genesis is a Progressive & Advanced WordPress theme framework from StudioPress. The Genesis Theme Framework is basically a functional WordPress theme with a considerable amount of inbuilt options and useful features which can be extended further with the use of child themes (or skins). With Genesis as a theme framework, StudioPress can provide the key [...]

    Read the article

  • SQL SERVER – Faster SQL Server Databases and Applications – Power and Control with SafePeak Caching Options

    - by Pinal Dave
    Update: This blog post is written based on SafePeak, which is available for free download. Today, I’d like to examine more closely one of my preferred technologies for accelerating SQL Server databases, SafePeak. SafePeak’s software provides a variety of advanced data caching options, techniques and tools to accelerate the performance and scalability of SQL Server databases and applications. I’d like to look more closely at some of these options, as some of these capabilities could help you address lagging database and application performance on your systems. To better understand the available options, it is best to start by understanding the difference between the usual “Basic Caching” and SafePeak’s “Dynamic Caching”. Basic Caching: Basic Caching (or the stale and static cache) is the ability to put the results from a query into cache for a certain period of time. It is based on TTL, or Time-to-live, and is designed to stay in cache no matter what happens to the data. For example, although the actual data can be modified due to DML commands (update/insert/delete), the cache will still hold the same obsolete query data. Meaning that Basic Caching is really a static/stale cache. As you can tell, this approach has its limitations. Dynamic Caching: Dynamic Caching (or the non-stale cache) is the ability to put the results from a query into cache while maintaining cache transaction awareness, looking for possible data modifications. The modifications can come as a result of: DML commands (update/insert/delete), indirect modifications due to triggers on other tables, executions of stored procedures with internal DML commands, and complex cases of stored procedures with multiple levels of internal stored procedure logic. When data modification commands arrive, the caching system identifies the related cache items and evicts them from cache immediately. In the dynamic caching option the TTL setting still exists, although its importance is reduced, since the main factor for cache invalidation (or cache eviction) becomes the actual data update commands. Now that we have a basic understanding of the differences between “basic” and “dynamic” caching, let’s dive in deeper. SafePeak: A comprehensive and versatile caching platform. SafePeak comes with a wide range of caching options. Some of SafePeak’s caching options are automated, while others require manual configuration. Together they provide a complete solution for IT and Data managers to reach excellent performance acceleration and application scalability for a wide range of business cases and applications. Automated caching of SQL Queries: Fully/semi-automated caching of all “read” SQL queries, containing any types of data, including Blobs, XMLs, Texts as well as all other standard data types. SafePeak automatically analyzes the incoming queries, categorizes them into SQL Patterns, identifying directly and indirectly accessed tables, views, functions and stored procedures; Automated caching of Stored Procedures: Fully or semi-automated caching of all “read” stored procedures, including procedures with complex sub-procedure logic as well as procedures with complex dynamic SQL code.
    All procedures are analyzed in advance by SafePeak’s Metadata-Learning process and their SQL schemas are parsed, resulting in a full understanding of the underlying code and object dependencies (tables, views, functions, sub-procedures), enabling automated or semi-automated (manually review and activate by a mouse click) cache activation, with full understanding of the transaction logic for real-time cache invalidation; Transaction aware cache: Automated cache awareness for SQL transactions (SQL and in-procs); Dynamic SQL Caching: Procedures with dynamic SQL are pre-parsed, enabling easy cache configuration, eliminating SQL Server load for parsing time and delivering high response time value even in the most complicated use cases; Fully Automated Caching: SQL Patterns (including SQL queries and stored procedures) that are categorized by SafePeak as “read and deterministic” are automatically activated for caching; Semi-Automated Caching: SQL Patterns categorized as “read and non-deterministic” are patterns of SQL queries and stored procedures that contain a reference to non-deterministic functions, like getdate(). Such SQL Patterns are reviewed by the SafePeak administrator and usually most of them are activated manually for caching (point-and-click activation); Fully Dynamic Caching: Automated detection of all dependent tables in each SQL Pattern, with automated real-time eviction of the relevant cache items in the event of “write” commands (a DML or a stored procedure) to one of the relevant tables. This is the default setting; Semi Dynamic Caching: A manual cache configuration option enabling you to reduce the sensitivity of specific SQL Patterns to “write” commands to certain tables/views. An optimization technique relevant for cases when the query data is either known to be static (like archive order details), or when the application's sensitivity to fresh data is not critical and the data can be stale for a short period of time (gaining better performance and reduced load); Scheduled Cache Eviction: A manual cache configuration option enabling scheduling of SQL Pattern cache eviction based on certain time(s) during a day. A very useful optimization technique when (for example) certain SQL Patterns can be cached but are time sensitive. Example: “select customers whose birthday is today”, a query with the getdate() function, which can and should be cached, but whose data stays relevant only until 00:00 (midnight); Parsing Exceptions Management: Stored procedures that were not fully parsed by SafePeak (due to overly complex dynamic SQL or unfamiliar syntax) are flagged as “Dynamic Objects” with the highest transaction safety settings (such as: full global cache eviction, DDL Check = lock cache and check for schema changes, and more). The SafePeak solution points the user to the Dynamic Objects that are important for cache effectiveness and provides an easy configuration interface, allowing you to improve cache hits and reduce global cache evictions. Usually this is the first configuration in a deployment; Overriding Settings of Stored Procedures: Override the settings of stored procedures (or other object types) for cache optimization. For example, in case a stored procedure SP1 has an “insert” into table T1, it will not be allowed to be cached. However, it is possible that T1 is just a “logging or instrumentation” table left by developers.
    By overriding the settings a user can allow caching of the problematic stored procedure; Advanced Cache Warm-Up: Creating an XML-based list of queries and stored procedures (with lists of parameters) for periodic automated pre-fetching and caching. An advanced tool allowing you to handle rarer but very performance-sensitive queries and pre-fetch them into cache, allowing high performance for users’ data access; Configuration Driven by Deep SQL Analytics: All SQL queries are continuously logged and analyzed, providing users with deep SQL Analytics and Performance Monitoring. Reduce troubleshooting from days to minutes with a heat map of database objects and SQL Patterns. The performance-driven configuration helps you focus on the most important settings that bring you the highest performance gains. Use of SafePeak SQL Analytics allows continuous performance monitoring and analysis, and easy identification of bottlenecks in both real-time and historical data; Cloud Ready: Available for instant deployment on Amazon Web Services (AWS). As you can see, there are many options to configure SafePeak’s SQL Server database and application acceleration caching technology to best fit a lot of situations. If you’re not familiar with their technology, they offer free-trial software you can download that comes with a free “help session” to help get you started. You can access the free trial here. Also, SafePeak is available to use on Amazon Cloud. Reference: Pinal Dave (http://blog.sqlauthority.com). Filed under: PostADay, SQL, SQL Authority, SQL Performance, SQL Query, SQL Server, SQL Tips and Tricks, T SQL
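    To illustrate the basic-versus-dynamic distinction described above in a language-neutral way, here is a small conceptual sketch. It is emphatically not SafePeak's API or implementation, just a toy model of a TTL-only cache versus a table-aware cache that evicts dependent entries the moment a write arrives:

        # Conceptual sketch only (NOT SafePeak's API): a TTL-only "basic" cache
        # versus a "dynamic" cache that tracks dependent tables and evicts
        # related entries as soon as a write touches one of those tables.
        import time

        class BasicCache:
            def __init__(self, ttl_seconds):
                self.ttl = ttl_seconds
                self.store = {}                          # sql text -> (result, expires_at)

            def get(self, sql):
                hit = self.store.get(sql)
                if hit and hit[1] > time.time():
                    return hit[0]                        # may be stale after a write
                return None

            def put(self, sql, result):
                self.store[sql] = (result, time.time() + self.ttl)

        class DynamicCache(BasicCache):
            def __init__(self, ttl_seconds):
                super().__init__(ttl_seconds)
                self.by_table = {}                       # table name -> set of cached sql keys

            def put(self, sql, result, tables):
                super().put(sql, result)
                for table in tables:
                    self.by_table.setdefault(table, set()).add(sql)

            def on_write(self, table):
                # A DML statement (or a proc that writes) against `table` evicts
                # every cached query that depends on it, so stale rows are never served.
                for sql in self.by_table.pop(table, set()):
                    self.store.pop(sql, None)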

    Read the article

  • Can LittleBigPlanet2's engine be used for other games?

    - by Bill
    LittleBigPlanet2 just came out. I've worked with the original LBP level editor a bit and really enjoyed it. I've read that LBP2's featureset in the game is much richer; is it possible to use these advanced features to create different sorts of game other than just a regular platformer? I imagine that something along the lines of a Breakout clone would definitely be manageable, but I'm interested in hearing more about the capabilities of the platform.

    Read the article

  • AIIM, Oracle and Keste - Talking Social Business in LA

    - by Brian Dirking
    We had a great event today in Los Angeles - AIIM, Oracle and Keste presented on how organizations are making social business work. Atle Skjekkeland of AIIM presented How Social Business Is Driving Innovation. Atle talked about a number of fascinating points, such as how answers to questions come from unexpected sources. Atle cited the fact that 38% of organizations get half or more of answers from unexpected sources, which speaks to the wisdom of the crowds and how people are benefiting from open communications tools to get answers to their questions. He also had a number of hilarious examples of companies that don't get it. If Comcast were to go to YouTube and search for Comcast, they would see that the number one hit after their paid ad is a video of one of their technicians asleep on a customer's couch. It seems that when he called the office for support he was put on hold so long he fell asleep. Dan O'Leary and Atle Skjekkeland After Atle's presentation I presented on Solving the Innovation Challenge with Oracle WebCenter. Atle had talked about McKinsey's research titled The Rise Of The Networked Enterprise: Web 2.0 Finds Its Payday. I brought in some new McKinsey research that built on that article. The new article is How Social Technologies Are Extending The Organization. A survey of 4,200 Global Executives brought three conclusions for the future: Boundaries among employees, vendors and customers will blur; Employee teams will self-organize; Data-driven decisions will rise. These three items were themes that repeated through the day as we went through examples of what customers are doing today. Next up was Vince Casarez of Keste. Vince was scheduled to profile one customer, but in an incredible 3 for 1 deal, Vince profiled Alcatel-Lucent, Qualcomm, and NetApp. Each of these implementations had content consolidation elements, as well as user engagement requirements that Keste was able to address with Oracle WebCenter. Vince Casarez of Keste. And we had a couple of good tweets worth reprinting here:
    danieloleary Daniel O'Leary: Learning about user engagement and social platforms from @bdirking #AIIM LA and @oracle event pic.twitter.com/1aNcLEUs
    danieloleary Daniel O'Leary: Users want to be able to share data and activity streams, work at organizations that embrace social via @bdirking
    skjekkeland Atle Skjekkeland: RT @danieloleary: Learning about user engagement and social platforms from @bdirking #AIIM LA and @oracle event pic.twitter.com/EWRYpvJa
    danieloleary Daniel O'Leary: Thanks again to @bdirking for an amazing event in LA today, really impressed with the completeness of web center
    JimLundy Jim Lundy: @ @danieloleary @bdirking yes, it is looking good - Web Center
    shadrachwhite Shadrach White: @ @bdirking @heybenito I heard the #AIIM event in LA was a hit
    We had some great conversations throughout the day, many thanks to everyone who joined in. We look forward to continuing the conversation - thanks again to everyone who attended!

    Read the article

  • First-Global-Teach for the Oracle Imaging and Process Management 11g: Administration: San Francisco

    - by stephen.schleifer
    First-Global-Teach for the Oracle Imaging and Process Management 11g: Administration: San Francisco | June 23-25. This course enables participants to use Oracle Imaging and Process Management (I/PM) 11g to access, track, and annotate documents. In addition, they also get an overview of the product architecture of Oracle I/PM running on Oracle WebLogic Server. The course also delves into administration tasks such as security permissions, configuration such as creating BPEL connections, and procedures for creating applications, searches, and input mappings. Customers and partners can register by looking up the course (#D61575GC10) on http://education.oracle.com

    Read the article

  • Learning Resources for SharePoint

    - by Enrique Lima
    SharePoint 2010 Reference: Software Development Kit SharePoint 2010: Getting Started with Development on SharePoint 2010 Hands-on Labs in C# and Visual Basic SharePoint Developer Training Kit Professional Development Evaluation Guide and Walkthrough SharePoint Server 2010: Advanced Developer Training Presentations

    Read the article

  • Ideas for extending tic-tac-toe game?

    - by pimvdb
    I'm building a 3D tic-tac-toe game and this is what I've implemented so far: 3D renderer with texture mapping Playing against the computer Playing online (multiplayer) Now I'm a little lost what I could add. Obviously, tic-tac-toe isn't that exciting or advanced, but I just miss something to salt it a little bit. Therefore, could anyone please suggest some ideas that would be worth implementing? Thanks!

    Read the article
