Search Results

  • Using Table-Valued Parameters With SQL Server Reporting Services

    - by Jesse
    In my last post I talked about using table-valued parameters to pass a list of integer values to a stored procedure without resorting to comma-delimited strings and parsing each value out into a TABLE variable. In this post I'll extend the "Customer Transaction Summary" report example to see how we might leverage this same stored procedure from within a SQL Server Reporting Services (SSRS) report. I've worked with SSRS off and on for the past several years and have generally found it to be a very useful tool for building nice-looking reports for end users quickly and easily. That said, I've been frustrated by SSRS from time to time when seemingly simple things are difficult to accomplish or simply not supported at all. I thought that using table-valued parameters from within an SSRS report would be simple, but unfortunately I was wrong.

    Customer Transaction Summary Example

    Let's take the "Customer Transaction Summary" report example from the last post and try to plug that same stored procedure into an SSRS report. Our report will have three parameters:
    - Start Date – beginning of the date range for which the report will summarize customer transactions
    - End Date – end of the date range for which the report will summarize customer transactions
    - Customer Ids – one or more customer Ids representing the customers that will be included in the report

    The simplest way to get started with this report will be to create a new dataset and point it at our Customer Transaction Summary report stored procedure (note that I'm using SSRS 2012 in the screenshots below, but there should be little to no difference with SSRS 2008). When you initially create this dataset the SSRS designer will try to invoke the stored procedure to determine what the parameters and output fields are for you automatically. As part of this process a parameter-definition dialog pops up. Obviously I can't use this dialog to specify a value for the '@customerIds' parameter, since it is of the IntegerListTableType user-defined type that we created in the last post. Unfortunately this really throws the SSRS designer for a loop, and regardless of what combination of Data Type, Pass Null Value, or Parameter Value I used here, I kept getting an error dialog with the message, "Operand type clash: nvarchar is incompatible with IntegerListTableType". This error message makes some sense considering that the nvarchar type is indeed incompatible with the IntegerListTableType, but there's little clue given as to how to remedy the situation. I don't know for sure, but I suspect that behind the scenes the SSRS designer is trying to give the @customerIds parameter an nvarchar-typed SqlParameter, which is causing the issue.

    When I first saw this error I figured that this might just be a limitation of the dataset designer and that I'd be able to work around the issue by manually defining the parameters. I know that there are some special steps that need to be taken when invoking a stored procedure with a table-valued parameter from ADO.NET, so I figured that I might be able to use some custom code embedded in the report to create a SqlParameter instance with the needed properties and value to make this work, but the "Operand type clash" error message persisted.

    The Text Query Approach

    Just because we're using a stored procedure to create the dataset for this report doesn't mean that we can't use the 'Text' Query Type option and construct an EXEC statement that will invoke the stored procedure.
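    Before looking at the query itself, it helps to recall the shape of the type and procedure involved. Here is a minimal sketch of what the definitions from the last post might look like; the column name, parameter types, and the placeholder body are my assumptions for illustration, not the actual code from that post:

        -- Hypothetical recap of the previous post's definitions (names assumed)
        CREATE TYPE IntegerListTableType AS TABLE (Value INT NOT NULL);
        GO
        CREATE PROCEDURE rpt_CustomerTransactionSummary
            @startDate   DATE,
            @endDate     DATE,
            @customerIds IntegerListTableType READONLY -- TVPs must be declared READONLY
        AS
        BEGIN
            -- Placeholder body: the real procedure summarizes transactions
            -- for the customers listed in @customerIds between the two dates.
            SELECT c.Value AS CustomerId
            FROM @customerIds AS c;
        END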
    In order for this to work properly the EXEC statement will also need to declare and populate an IntegerListTableType variable to pass into the stored procedure. Before I go any further I want to make one point clear: this is a really ugly hack and it makes me cringe to do it. Simply put, I strongly feel that it should not be this difficult to use a table-valued parameter with SSRS. With that said, let's take a look at what we'll have to do to make this work.

    Manually Define Parameters

    First, we'll need to manually define the parameters for the report by right-clicking on the 'Parameters' folder in the 'Report Data' window. We'll define '@startDate' and '@endDate' as simple date parameters. We'll also create a parameter called '@customerIds' that will be a multi-valued Integer parameter. In the 'Available Values' tab we'll point this parameter at a simple dataset that just returns the CustomerId and CustomerName of each row in the Customers table of the database, or manually define a handful of Customer Id values to make available when the report runs.

    Once we have these parameters properly defined we can take another crack at creating the dataset that will invoke the 'rpt_CustomerTransactionSummary' stored procedure. This time we'll choose the 'Text' query type option and put the following into the 'Query' text area:

        1: exec('declare @customerIdList IntegerListTableType ' + @customerIdInserts +
        2: ' EXEC rpt_CustomerTransactionSummary
        3: @startDate=''' + @startDate + ''',
        4: @endDate=''' + @endDate + ''',
        5: @customerIds=@customerIdList')

    By using the 'Text' query type we can enter any arbitrary SQL that we want and then use parameters and string concatenation to inject pieces of that query at run time. It can be a bit tricky to parse this out at first glance, but from the SSRS designer's point of view this query defines three parameters:
    - @customerIdInserts – A Text parameter that we use to define INSERT statements that will populate the @customerIdList variable being declared in the SQL. This parameter won't actually ever get passed into the stored procedure. I'll go into how this works in a bit.
    - @startDate – A simple date parameter that gets passed through directly into the @startDate parameter of the stored procedure on line 3.
    - @endDate – Another simple date parameter that gets passed through into the @endDate parameter of the stored procedure on line 4.

    At this point the dataset designer will be able to correctly parse the query, and should even be able to detect the fields that the stored procedure will return without needing any values specified for the query when prompted. Once the dataset has been correctly defined we'll have a @customerIdInserts parameter listed in the 'Parameters' tab of the dataset designer. We need to define an expression for this parameter that will take the values selected by the user for the '@customerIds' parameter that we defined earlier and convert them into INSERT statements that will populate the @customerIdList variable that we defined in our Text query. In order to do this we'll need to add some custom code to our report using the 'Report Properties' dialog. Any custom code defined in the Report Properties dialog gets embedded into the .rdl of the report itself and (unfortunately) must be written in VB.NET.
    Note that you can also add references to custom .NET assemblies (which could be written in any language), but that's outside the scope of this post, so we'll stick with the "quick and dirty" VB.NET approach for now. Here's the VB.NET code (note that any embedded code you add here must be defined in a static/shared function, though you can define as many functions as you want):

        Public Shared Function BuildIntegerListInserts(ByVal variableName As String, ByVal paramValues As Object()) As String
            Dim insertStatements As New System.Text.StringBuilder()
            For Each paramValue As Object In paramValues
                insertStatements.AppendLine(String.Format("INSERT {0} VALUES ({1})", variableName, paramValue))
            Next
            Return insertStatements.ToString()
        End Function

    This method takes a variable name and an array of objects. We use an array of objects here because that is how SSRS will pass us the values that were selected by the user at run time. The method uses a StringBuilder to construct INSERT statements that will insert each value from the object array into the provided variable name. Once this method has been defined in the custom code for the report we can go back into the dataset designer's Parameters tab and update the expression for the '@customerIdInserts' parameter by clicking on the button with the "function" symbol that appears to the right of the parameter value. We'll set the expression to:

        =Code.BuildIntegerListInserts("@customerIdList ", Parameters!customerIds.Value)

    In order to invoke our custom code method we simply need to invoke "Code.<method name>" and pass in any needed parameters. The first parameter needs to match the name of the IntegerListTableType variable that we used in the EXEC statement of our query. The second parameter will come from the Value property of the '@customerIds' parameter (this evaluates to an object array at run time). Finally, we'll need to edit the properties of the '@customerIdInserts' parameter on the report to mark it as a nullable internal parameter so that users aren't prompted to provide a value for it when running the report.

    Limitations And Final Thoughts

    When I first started looking into the text query approach described above I wondered if there might be an upper limit to the size of the string that can be used to run a report. Obviously, the size of the actual query could increase pretty dramatically if you have a parameter that has a lot of potential values or you need to support several different table-valued parameters in the same query. I tested the example Customer Transaction Summary report with 1,000 selected customers without any issue, but your mileage may vary depending on how much data you might need to pass into your query (the reconstructed run-time string shown after this post gives a sense of how quickly it grows). If you think that the text query hack is a lot of work just to use a table-valued parameter, I agree! I think that it should be a lot easier than this to use a table-valued parameter from within SSRS, but so far I haven't found a better way. It might be possible to create some custom .NET code that could build the EXEC statement for a given set of parameters automatically, but exploring that will have to wait for another post. For now, unless there's a really compelling reason or requirement to use table-valued parameters from SSRS reports, I would probably stick with the tried and true "join-multi-valued-parameter-to-CSV-and-split-in-the-query" approach for using multi-valued parameters in a stored procedure.
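    To make the hack concrete, here is roughly what the concatenated string that finally reaches SQL Server would look like if a user selected customers 3, 5, and 42 for the first quarter of 2012 (a reconstruction based on the expression above, not captured output; the dates are illustrative):

        declare @customerIdList IntegerListTableType
        INSERT @customerIdList VALUES (3)
        INSERT @customerIdList VALUES (5)
        INSERT @customerIdList VALUES (42)
        EXEC rpt_CustomerTransactionSummary
            @startDate='2012-01-01',
            @endDate='2012-03-31',
            @customerIds=@customerIdList

    Every additional selected customer adds another INSERT line to this string, which is why the overall query size is worth worrying about.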

  • SQL SERVER – 2011 – Introduction to SEQUENCE – Simple Example of SEQUENCE

    - by pinaldave
    SQL Server 2011 will contain a very interesting feature called SEQUENCE. I have waited for this feature for a really long time, and I am glad it is finally here. SEQUENCE allows you to define a single point of repository where SQL Server will maintain an in-memory counter. SEQUENCE is a very interesting concept and I will write a few blog posts on this subject in the future. Today we will see only a working example. Let us create a sequence. We can specify various values, like the start value and increment value, as well as a max value.

        USE AdventureWorks2008R2
        GO
        CREATE SEQUENCE [Seq]
        AS [int]
        START WITH 1
        INCREMENT BY 1
        MAXVALUE 20000
        GO

    Once the sequence is defined, it can be fetched using the following method. Every single time, a new incremental value is provided, irrespective of sessions.

        -- First Run
        SELECT NEXT VALUE FOR Seq, c.CustomerID
        FROM Sales.Customer c
        GO
        -- Second Run
        SELECT NEXT VALUE FOR Seq, c.AccountNumber
        FROM Sales.Customer c
        GO

    The sequence will generate values up to the max value specified. Once the max value is reached, the query will stop and return an error message:

        Msg 11728, Level 16, State 1, Line 2
        The sequence object 'Seq' has reached its minimum or maximum value. Restart the sequence object to allow new values to be generated.

    We can restart the sequence from any particular value and it will work fine.

        -- Restart the Sequence
        ALTER SEQUENCE [Seq]
        RESTART WITH 1
        GO
        -- Sequence Restarted
        SELECT NEXT VALUE FOR Seq, c.CustomerID
        FROM Sales.Customer c
        GO

    Let us do a final clean up.

        -- Clean Up
        DROP SEQUENCE [Seq]
        GO

    There are lots of things one can find useful about this feature (one handy pattern, using a sequence as a column default, is sketched after this post). We will see more in future posts. Here is the complete code for easy reference.

        USE AdventureWorks2008R2
        GO
        CREATE SEQUENCE [Seq]
        AS [int]
        START WITH 1
        INCREMENT BY 1
        MAXVALUE 20000
        GO
        -- First Run
        SELECT NEXT VALUE FOR Seq, c.CustomerID
        FROM Sales.Customer c
        GO
        -- Second Run
        SELECT NEXT VALUE FOR Seq, c.AccountNumber
        FROM Sales.Customer c
        GO
        -- Restart the Sequence
        ALTER SEQUENCE [Seq]
        RESTART WITH 1
        GO
        -- Sequence Restarted
        SELECT NEXT VALUE FOR Seq, c.CustomerID
        FROM Sales.Customer c
        GO
        -- Clean Up
        DROP SEQUENCE [Seq]
        GO

    Reference: Pinal Dave (http://blog.SQLAuthority.com)
    Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Technology
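    As promised above, here is the column-default pattern: binding a sequence to a table column gives IDENTITY-like behavior that can be shared across tables. A minimal sketch (the table is hypothetical and assumes the [Seq] sequence from the post still exists):

        -- Use the sequence as a column default (illustrative table)
        CREATE TABLE dbo.DemoOrders
        (
            OrderID INT NOT NULL
                CONSTRAINT DF_DemoOrders_OrderID DEFAULT (NEXT VALUE FOR [Seq]),
            OrderDate DATE NOT NULL
        );
        -- OrderID is filled from [Seq] automatically
        INSERT dbo.DemoOrders (OrderDate) VALUES ('2011-01-01');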

  • The Benefits of Smart Grid Business Software

    - by Sylvie MacKenzie, PMP
    Smart Grid Background

    What Are Smart Grids? Smart Grids use computer hardware and software, sensors, controls, and telecommunications equipment and services to:
    - Link customers to information that helps them manage consumption and use electricity wisely.
    - Enable customers to respond to utility notices in ways that help minimize the duration of overloads, bottlenecks, and outages.
    - Provide utilities with information that helps them improve performance and control costs.

    What Is Driving Smart Grid Development?

    Environmental Impact. Smart Grid development is picking up speed because of the widespread interest in reducing the negative impact that energy use has on the environment. Smart Grids use technology to drive efficiencies in transmission, distribution, and consumption. As a result, utilities can serve customers' power needs with fewer generating plants, fewer transmission and distribution assets, and lower overall generation. With the possible exception of wind farm sprawl, landscape preservation is one obvious benefit. And because most generation today results in greenhouse gas emissions, Smart Grids reduce air pollution and the potential for global climate change. Smart Grids also more easily accommodate the technical difficulties of integrating intermittent renewable resources like wind and solar into the grid, providing further greenhouse gas reductions.

    Costs. The ability to defer the cost of plant and grid expansion is a major benefit to both utilities and customers. Utilities do not need to use as many internal resources for traditional infrastructure project planning and management, and large T&D infrastructure expansion costs are not passed on to customers. Smart Grids will not eliminate capital expansion, of course. Transmission corridors to connect renewable generation with customers will require major near-term expenditures. Additionally, in the future, electricity to satisfy the needs of population growth and additional applications will exceed the capacity reductions available through the Smart Grid. At that point, expansion will resume—but with greater overall T&D efficiency based on demand response, load control, and many other Smart Grid technologies and business processes. Energy efficiency is a second area of Smart Grid cost saving of particular relevance to customers. The timely and detailed information Smart Grids provide encourages customers to limit waste, adopt energy-efficient building codes and standards, and invest in energy-efficient appliances. Efficiency may or may not lower customer bills, because customer efficiency savings may be offset by higher costs in generation fuels or carbon taxes. It is clear, however, that bills will be lower with efficiency than without it.

    Utility Operations. Smart Grids can serve as the central focus of utility initiatives to improve business processes. Many utilities have long "wish lists" of projects and applications they would like to fund in order to improve customer service or ease staff's burden of repetitious work, but they have difficulty cost-justifying the changes, especially in the short term. Adding Smart Grid benefits to the cost/benefit analysis frequently tips the scales in favor of the change and can also significantly reduce payback periods. Mobile workforce applications and asset management applications work together to deploy assets and then to maintain, repair, and replace them. Many additional benefits result—for instance, increased productivity and fuel savings from better routing. Similarly, customer portals that provide customers with near-real-time information can also encourage online payments, thus lowering billing costs. Utilities can and should include these cost and service improvements in the list of Smart Grid benefits.

    What Is Smart Grid Business Software?

    Smart Grid business software gathers data from a Smart Grid and uses it to improve a utility's business processes. Smart Grid business software also helps utilities provide relevant information to customers, who can then use it to reduce their own consumption and improve their environmental profiles.

    Smart Grid Business Software Minimizes the Impact of Peak Demand

    Utilities must size their assets to accommodate their highest peak demand. The higher the peak rises above base demand:
    - The more assets a utility must build that are used only for brief periods—an inefficient use of capital.
    - The higher the utility's risk profile rises, given the uncertainties surrounding the time needed for permitting, building, and recouping costs.
    - The higher the costs for utilities to purchase supply, because generators can charge more for contracts and spot supply during high-demand periods.

    Smart Grids enable a variety of programs that reduce peak demand, including:
    - Time-of-use pricing and critical peak pricing—programs that charge customers more when they consume electricity during peak periods. Pilot projects indicate that these programs are successful in flattening peaks, thus ensuring better use of existing T&D and generation assets.
    - Direct load control, which lets utilities reduce or eliminate electricity flow to customer equipment (such as air conditioners). Contracts govern the terms and conditions of these turn-offs.
    - Indirect load control, which signals customers to reduce the use of on-premises equipment for contractually agreed-on time periods. Smart Grid business software enables utilities to impose penalties on customers who do not comply with their contracts.

    Smart Grids also help utilities manage peaks with existing assets by enabling:
    - Real-time asset monitoring and control. In this application, advanced sensors safely enable dynamic capacity load limits, ensuring that all grid assets can be used to their maximum capacity during peak demand periods. Real-time asset monitoring and control applications also detect the location of excessive losses and pinpoint the need for mitigation and asset replacements. As a result, utilities reduce outage risk and guard against excess capacity or "over-build".
    - Better peak demand analysis. As a result: distribution planners can better size equipment (e.g., transformers) to avoid over-building; operations engineers can identify and resolve bottlenecks and other inefficiencies that may cause or exacerbate peaks, again reducing the tendency to over-build; and supply managers can more closely match procurement with delivery, fine-tuning supply portfolios, reducing the tendency to over-contract for peak supply, and reducing the need to resort to spot market purchases during high peaks.

    Smart Grids can help lower the cost of remaining peaks by standardizing interconnections for new distributed resources (such as electricity storage devices) and placing the interconnections where needed to support anticipated grid congestion.

    Smart Grid Business Software Lowers the Cost of Field Services

    By processing Smart Grid data through their business software, utilities can reduce such field costs as:
    - Vegetation management. Smart Grids can pinpoint momentary interruptions and tree-caused outages. Spatial mash-up tools leverage GIS models of tree growth for targeted vegetation management. This reduces the cost of unnecessary tree trimming.
    - Service vehicle fuel. Many utility service calls are "false alarms." Checking meter status before dispatching crews prevents many unnecessary "truck rolls." Similarly, crews use far less fuel when Smart Grid sensors can pinpoint a problem and mobile workforce applications can then route them directly to it.

    Smart Grid Business Software Ensures Regulatory Compliance

    Smart Grids can ensure compliance with private contracts and with regional, national, or international requirements by:
    - Monitoring fulfillment of contract terms. Utilities can use one-hour interval meters to ensure that interruptible ("non-core") customers actually reduce or eliminate deliveries as required. They can use the information to levy fines against contract violators.
    - Monitoring regulations imposed on customers, such as maximum use during specific time periods.
    - Using accurate time-stamped event history derived from intelligent devices distributed throughout the smart grid to monitor and report reliability statistics and risk compliance.
    - Automating business processes and activities that ensure compliance with security and reliability measures (e.g., NERC CIP-002 through CIP-009).

    Smart Grid Business Software Strengthens Utilities' Connection to Customers While Reducing Customer Service Costs

    During outages, Smart Grid business software can:
    - Identify outages more quickly. Software uses sensors to pinpoint outages and nested outage locations, and permits utilities to ensure outage resolution at every meter location.
    - Size outages more accurately, permitting utilities to dispatch crews that have the skills needed, in appropriate numbers.
    - Provide updates on outage location and expected duration. This information helps call centers inform customers about the timing of service restoration. Smart Grids also facilitate display of outage maps for customer and public-service use.

    Smart Grids can significantly reduce the cost to:
    - Connect and disconnect customers. Meters capable of remote disconnect can virtually eliminate the costs of field crews and vehicles previously required to change service from the old to the new residents of a metered property or to disconnect customers for nonpayment.
    - Resolve reports of voltage fluctuation. Smart Grids gather and report voltage and power quality data from meters and grid sensors, enabling utilities to pinpoint reported problems or resolve them before customers complain.
    - Detect and resolve non-technical losses (e.g., theft). Smart Grids can identify illegal attempts to reconnect meters or to use electricity in supposedly vacant premises. They can also detect theft by comparing flows through delivery assets with billed consumption.

    Smart Grids also facilitate outreach to customers. By monitoring and analyzing consumption over time, utilities can:
    - Identify customers with unusually high usage and contact them before they receive a bill. They can also suggest conservation techniques that might help to limit consumption. This can head off "high bill" complaints to the contact center. Note that such "high usage" or "additional charges apply because you are out of range" notices—frequently via text messaging—are already common among mobile phone providers.
    - Help customers identify appropriate bill payment alternatives (budget billing, prepayment, etc.).
    - Help customers find and reduce causes of over-consumption. There's no waiting for bills in the mail before they even understand there is a problem. Utilities benefit not just through improved customer relations but also through limiting the size of bills from customers who might struggle to pay them.

    Where permitted, Smart Grids can open the doors to such new utility service offerings as:
    - Monitoring properties. Landlords reduce the costs of vacant properties when utilities notify them of unexpected energy or water consumption. Utilities can perform similar services for owners of vacation properties or the adult children of aging parents.
    - Monitoring equipment. Power-use patterns can reveal a need for equipment maintenance. Smart Grids permit utilities to alert owners or managers to a need for maintenance or replacement.
    - Facilitating home and small-business networks. Smart Grids can provide a gateway to equipment networks that automate control or let owners access equipment remotely. They also facilitate net metering, offering some utilities a path toward involvement in small-scale solar or wind generation.
    - Prepayment plans that do not need special meters.

    Smart Grid Business Software Helps Customers Control Energy Costs

    There is no end to the ways Smart Grids help both small and large customers control energy costs. For instance:
    - Multi-premises customers appreciate having all meters read on the same day so that they can more easily compare consumption at various sites.
    - Customers in competitive regions can match their consumption profile (detailed via Smart Grid data) with specific offerings from competitive suppliers.
    - Customers seeing inexplicable consumption patterns and power quality problems may investigate further. The result can be discovery of electrical problems that can be resolved through rewiring or maintenance—before more serious fires or accidents happen.

    Smart Grid Business Software Facilitates Use of Renewables

    Generation from wind and solar resources is a popular alternative to fossil fuel generation, which emits greenhouse gases. Wind and solar generation may also increase energy security in regions that currently import fossil fuel for use in generation. Utilities face many technical issues as they attempt to integrate intermittent resource generation into grids that traditionally handle only fully dispatchable generation. Smart Grid business software helps solve many of these issues by:
    - Detecting sudden drops in production from renewables-generated electricity (wind and solar) and automatically triggering electricity storage and smart appliance response to compensate as needed.
    - Supporting industry-standard distributed generation interconnection processes to reduce interconnection costs and avoid adding renewable supplies to locations already subject to grid congestion.
    - Facilitating modeling and monitoring of locally generated supply from renewables and thus helping to maximize their use.
    - Increasing the efficiency of "net metering" (through which utilities can use electricity generated by customers) by providing data for analysis and by integrating the production and consumption aspects of customer accounts.

    During non-peak periods, such techniques enable utilities to increase the percentage of renewable generation in their supply mix. During peak periods, Smart Grid business software controls circuit reconfiguration to maximize available capacity.

    Conclusion

    Utility missions are changing. Yesterday, they focused on delivery of reasonably priced energy and water. Tomorrow, their missions will expand to encompass sustainable use and environmental improvement. Smart Grids are key to helping utilities achieve this expanded mission, but they come at a relatively high price. Utilities will need to invest heavily in new hardware, software, business process development, and staff training. Customer investments in home area networks and smart appliances will be large. Learning to change the energy and water consumption habits of a lifetime could ultimately prove an even more formidable task. Smart Grid business software can ease the cost and difficulties inherent in a needed transition to a more flexible, reliable, responsive electricity grid. Justifying its implementation, however, requires a full understanding of the benefits it brings—benefits that can ultimately help customers, utilities, communities, and the world address global issues like energy security and climate change while minimizing costs and maximizing customer convenience.

    This white paper is available for download here. For further information about Oracle's Primavera Solutions for Utilities, please read our Utilities e-book.

  • Business Objects - Containers or functional?

    - by Walter
    This is a question I asked a while back on SO, but it may get discussed better here... Where I work, we've gone back and forth on this subject a number of times and are looking for a sanity check. Here's the question: should Business Objects be data containers (more like DTOs), or should they also contain logic that can perform some functionality on that object? Example: take a customer object. It probably contains some common properties (Name, Id, etc.); should that customer object also include functions (Save, Calc, etc.)? One line of reasoning says to separate the object from the functionality (the single responsibility principle) and put the functionality in a Business Logic layer or object. The other line of reasoning says no: if I have a customer object I just want to call Customer.Save and be done with it. Why do I need to know about another class to save a customer if I'm consuming the object? Our last two projects have had the objects separated from the functionality, but the debate has been raised again on a new project. Which makes more sense, and why?

  • 2 Events, 2 Countries, 1 Day

    - by Noelia Gomez
    Last Tuesday, October 23, was a day of intense activity in both Spain and Portugal. The CxO Dialogue, organized by Econique with Oracle's participation, took place in Madrid at the Hotel Puerta de América. The goal of this gathering was to exchange views on every aspect of strategic customer management and the Contact Centre. In this setting, attendees had the opportunity to hold one-to-one meetings with our best experts. Oracle also presented two discussion panels on "New needs, strategies, and trends in Marketing management", led by Gema Sebastian, Principal Sales Consultant at Oracle. In these panels, participants from companies such as Caprabo, Carrefour, Endesa, Jaguar Land Rover, and Repsol (among others) addressed highly topical issues for Marketing executives. The round table focused above all on social media marketing, with a shared perception that it is necessary but that the market still does not quite know how to approach it. Active listening within the networks, and the ability to react to specific signals, were seen as a clear starting point for active work, and an area where Oracle can help. Customer experience was another topic discussed at this table, where it was made clear that the consumer is now in charge, wanting to see things where and how they choose, and that a marketing message must be delivered at the right moment and bring real value for the consumer to accept it as something of interest. Here too, Oracle has tools to make this possible.

    Meanwhile, in Lisbon, Total Training 2012 took place, a conference organized by the IFE Group. More than 100 human resources professionals from Portugal's most important companies participated, building on shared knowledge and experience, the exchange of ideas, and discussion of the opportunities professionals in this area currently face. Here Oracle gave a presentation on "New concepts in HR", delivered by Julio Rodriguez, Principal Sales Consultant at Oracle, which highlighted several technology concepts relevant to talent management that, being new, were not widely known among HR professionals, such as:
    - SaaS (Software as a Service)
    - BI (Business Intelligence) for HR
    - Social networking and how to integrate it within the company
    - The talent map, finally out of Excel and into an application
    - Mobility in HR applications

    Without a doubt, this was a day full of exchanged experiences and knowledge for two major areas: Human Resources and strategic customer management.

    To learn more about customer experience: Customer Concepts Magazine, Customer Concepts Exchange in LinkedIn, Customer Concepts Web TV, Customer Experience @ Oracle.com, Customer Experience Facebook Hub, Customer Experience YouTube Channel, Customer Experience Twitter.
    To learn more about HCM (Human Capital Management): Oracle Fusion Applications, Oracle Fusion Human Capital Management, Oracle PartnerNetwork, Oracle Consulting Services, Oracle Human Capital Management Blog, Oracle HCM on Twitter, Oracle HCM on Facebook.

  • Oracle and Eloqua Welcome Compendium’s Content Marketing

    - by Mike Stiles
    Yesterday, Oracle announced its acquisition of Compendium, a cloud-based content marketing provider that helps companies plan, produce, and deliver engaging content across multiple channels throughout their customers' lifecycle. Why? Because every part of the above paragraph speaks to where modern marketing is and where it's headed.

    Customers have now been empowered, thanks to the Internet and particularly social, with access to almost limitless amounts of information about companies and products. This includes the especially influential voices of friends and objective acquaintances who have experience with the product or brand. With mobile, this info is available instantly in the palm of their hand. All of this research and influence, mind you, is taking place long before a prospect will ever engage with the brand itself or one of its sales reps. So how does a brand effectively insert itself into these conversations and this flow of the customer journey?

    Now, more than ever, marketers must deliver relevant and engaging content across multiple channels and throughout the entire customer journey to be useful, helpful, and influential. Compendium has a data-driven content marketing platform that lines up relevant content with customer data and personas so brands can accelerate the conversion of prospects. Now think about combining that with the Oracle Eloqua Marketing Cloud, part of Oracle's comprehensive CX solution. Marketers will be able to automate content delivery across channels by aligning persona-based content with customers' digital body language. Better customer engagement, improved sales lead quality, better return on marketing investment, and higher customer loyalty. Now we're talking.

    Does data-driven content marketing have an impact? Compendium customer CVENT is a SaaS company specializing in meetings management tech. They wanted to increase leads and ad performance on their blog and dramatically increase their content. They also wanted to manage the creation, workflow, promotion, and distribution of that content. With Compendium, CVENT created over 9,000 content elements, and sales-ready leads grew 325%.

    So Oracle Eloqua helps you target audiences, know buyers, and automate multi-channel marketing campaigns. Compendium lets you plan, publish, manage, and measure content across content types and channels. Now kick it up yet another notch with Oracle's Analytics, Big Data, and Social solutions, and you're using your marketing dollars to reach the right people in the right place at the right time with the right content. And as if that weren't enough, your customers will love you for it. @mikestiles

  • Large invoice database structure and rendering

    - by user132624
    Our client has an MS SQL database that has 1 million customer invoice records in it. Using the database, our client wants its customers to be able to log into a frontend web site and then view, modify, and download their company's invoices. Given the size of the database and the large number of customers who may log into the web site at any time, we are concerned about database engine performance and web page invoice rendering performance. The 1 million invoice database covers just 90 days' sales, so we will remove invoices over 90 days old from the database. Most of the invoices have multiple line items. We can easily convert our invoices into various data formats, so for example it is easy for us to convert to and from SQL to XML with a related schema and XSLT. Any data conversion would be done on another server so as not to burden the web interface server. We have tentatively decided to run the web site on a .NET Framework IIS web server using MS SQL on MS Azure. How would you suggest we structure our database for best performance? For example, should we put all the invoices of all customers located within the same 5-digit or 6-digit zip codes into the same table? Or could we set up a separate home directory for each customer on IIS and place each customer's invoices in each customer's home directory in XML format? And secondly, what would you suggest would be the best method to render customer invoices on a web page and allow customers to modify them, for best performance? The ADO.NET XML DataSet looks intriguing to us as a method, but we have never used it.
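    For comparison with the zip-code and per-directory ideas above, the conventional baseline for this kind of workload is a normalized invoice/line-item pair clustered on the customer, so each customer's 90 days of invoices sit together on disk. A minimal sketch (all names and types here are illustrative assumptions, not the client's actual schema):

        -- Illustrative baseline schema (names and types assumed)
        CREATE TABLE dbo.Invoice (
            CustomerId  INT           NOT NULL,
            InvoiceId   BIGINT        NOT NULL,
            InvoiceDate DATE          NOT NULL,
            TotalAmount DECIMAL(18,2) NOT NULL,
            CONSTRAINT PK_Invoice
                PRIMARY KEY CLUSTERED (CustomerId, InvoiceId) -- one customer's invoices stay contiguous
        );

        CREATE TABLE dbo.InvoiceLine (
            CustomerId  INT           NOT NULL,
            InvoiceId   BIGINT        NOT NULL,
            LineNumber  INT           NOT NULL,
            Description NVARCHAR(200) NOT NULL,
            Amount      DECIMAL(18,2) NOT NULL,
            CONSTRAINT PK_InvoiceLine
                PRIMARY KEY CLUSTERED (CustomerId, InvoiceId, LineNumber)
        );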

  • How Are Businesses Advancing with the Experience Revolution?

    - by Charles Knapp
    Businesses worldwide are operating in a new era. Customers are taking charge of their relationships with brands, and the customer experience has become the most important differentiator and driver of business value. Where is the experience heading? And how can businesses take advantage of the customer experience revolution? Find out from the experts at a one-of-a-kind event: the Oracle Customer Experience Summit at Oracle OpenWorld, San Francisco, October 3-5. Our featured speakers are global visionaries including Seth Godin, George Kembel from the Stanford d:School, Bruce Temkin, Kerry Bodine and Paul Hagen from Forrester, and Gene Alvarez from Gartner. Featured industry leaders will include speakers from Athene Group, Bazaarvoice, Comcast, Consortium of Service Innovation, Haworth, Intuit, KPN, Marriott, Nikon, Quicksilver, Royal Caribbean, SapientNitro, Southwest, Stryker, Stuart Concannon, and Twilio. Featured speakers from Oracle will include Oracle President Mark Hurd, Anthony Lye, David Vap, Brian Curran, John Kembel, and Matthew Banks. So, please join us at the Customer Experience Summit at the Oracle OpenWorld Conference.

  • Learn Best Practices at Oracle OpenWorld

    - by Oracle OpenWorld Blog Team
    By Joan Jenkins

    Oracle Advanced Customer Support Services Knows Best

    Learn key best practices to maximize performance and availability from Oracle Advanced Customer Support Services. Plan to attend one or more of our sessions, with topics including Oracle Exadata best practices, Oracle E-Business Suite upgrades, Oracle GoldenGate, and Oracle Platinum Services. Or stop by the Support Stars Bar to ask questions and get more information. Find out more about what you can learn from Oracle Advanced Customer Support Services at Oracle OpenWorld.

  • Here's your chance: MOS Feedback Sessions @OOW

    - by cwarticki
    Bring your questions, comments, concerns, opinions, recommendations, enhancement requests and any emotional outbursts! As I travel the world and speak to thousands of customers, I receive plenty of feedback about My Oracle Support. Come hear directly from the source. Meet Dennis Reno, VP of Customer Portal Experience. The Customer Portal Experience team will host a My Oracle Support Tips and Techniques session and three roundtable feedback sessions at this year's Oracle OpenWorld. The sessions will include a Hardware Support component, as well as best practices that are sure to benefit all My Oracle Support users. The events planned will give our users the opportunity to learn more about how the My Oracle Support customer portal adds value to the support process and to their business needs. The roundtable feedback sessions will allow customers to meet, give feedback, and share their experiences directly with the team responsible for the customer portal experience.

    Date       | Time (PT) | Session Name
    Mon, Oct 1 | 01:45 PM  | My Oracle Support: Tips and Techniques for Getting the Best Hardware Support Possible (Session #CON9745)
    Tue, Oct 2 | 11:00 AM  | Roundtable - My Oracle Support General Feedback
    Wed, Oct 3 | 11:00 AM  | Roundtable - My Oracle Support Community Feedback
    Thu, Oct 4 | 11:00 AM  | Roundtable - My Oracle Support General Feedback

    Customers can find more information, including specific details about how to attend, by accessing My Oracle Support at OpenWorld (Article ID 1484508.1). Enjoy OpenWorld everyone! -Chris Warticki, Global Customer Management

  • apt-get install kernel source error

    - by MA1
    apt-get source linux-image-$(uname -r) gives me the below error:

        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        Picking 'linux' as source package instead of 'linux-image-3.0.0-12-generic'
        NOTICE: 'linux' packaging is maintained in the 'Git' version control system at:
        http://kernel.ubuntu.com/git-repos/ubuntu/ubuntu-oneiric.git
        Skipping already downloaded file 'linux_3.0.0-17.30.dsc'
        Need to get 99.9 MB of source archives.
        Err http://pk.archive.ubuntu.com/ubuntu/ oneiric-updates/main linux 3.0.0-17.30 (tar)
          500 ( The request was rejected by the HTTP filter. Contact your ISA Server administrator. )
        Err http://pk.archive.ubuntu.com/ubuntu/ oneiric-updates/main linux 3.0.0-17.30 (diff)
          500 ( The request was rejected by the HTTP filter. Contact your ISA Server administrator. )
        Failed to fetch http://pk.archive.ubuntu.com/ubuntu/pool/main/l/linux/linux_3.0.0.orig.tar.gz  500 ( The request was rejected by the HTTP filter. Contact your ISA Server administrator. )
        Failed to fetch http://pk.archive.ubuntu.com/ubuntu/pool/main/l/linux/linux_3.0.0-17.30.diff.gz  500 ( The request was rejected by the HTTP filter. Contact your ISA Server administrator. )
        E: Failed to fetch some archives.

    Can someone suggest a solution?

  • New at TRC: Networking Products

    - by uwes
    The new category "Networking Products" was added last week at the Oracle Hardware Technical Resource Center (HW TRC). The following list summarizes the different areas that are included. Feel free to explore.
    - Oracle Virtual Networking: customer and technical presentations, datasheets, partner FAQ and more
    - 10 GbE Network Adapters and Switches: customer and technical presentations, datasheets, partner FAQ, documentation and more
    - Gigabit Ethernet: customer presentations, partner FAQ, documentation and more
    - InfiniBand: datasheets, partner FAQ and documentation
    - Blade Server Network Express Modules (NEMs): technical presentation, datasheets, partner FAQ, white paper and more
    - Storage Networking: customer presentations, datasheets, partner FAQ and more
    Please be aware that you need to be registered at the Oracle HW TRC. To register click here ... and follow the instructions.

  • Love and Hate Outlook autocomplete, Outlook 2010/Exchange 2010

    - by Kay Sellenrode
    I think that almost every Exchange admin can concur with me that the Outlook autocomplete cache is one of those things you love but at the same time also hate. Users mostly love this function, except when it fails. Luckily, since Outlook 2010 things got a little better and we got rid of the dreaded nk2 files. Outlook 2010 now includes a folder named "Suggested Contacts"; all users you send an email to that don't already have a contact object are saved in this suggested contacts folder. A lot of people thought this folder is also the source for the autocomplete cache, which would make it somewhat easy to manage. I wish the solution were that easy. Badly enough, separate from the suggested contacts, Outlook still maintains a cache for the autocomplete function.

    Let us say you run into the following situation: John works for company A and is a popular contact for almost everyone in your organization. Now John quits his job at company A and moves to company B. Luckily John keeps your company as a customer, but his email address changes from companyA.com to companyB.com. Since you don't want to do any business with company A anymore, you want to make sure none of your users accidentally mails his old address. Now this is where the real fun starts, because almost all of your 1000 users have mailed at least once with John, meaning that practically every user has John listed in their autocomplete cache.

    I have run into situations like this multiple times with several customers, which is always a pain. And of course this blog post is the result of one of those issues once again. I knew that with the Suggested Contacts we could do more than previously, but still never spent time on it before. But today I thought, let's nail this now and forever!

    First, note that things are different for every combination of Outlook and Exchange; I explain the procedure for Exchange 2010 SP1+ in combination with Outlook 2010. At first we want to get rid of all contact objects that contain [email protected]. To do this we need to be assigned the RBAC role "Mailbox Import Export", which can be done through the Exchange Control Panel. In my test environment I assigned this role to the Organization admins, but in real life you might want to add it to a custom role. Open the Exchange Control Panel by logging in to the ECP URL (in my case https://ITFEX.itf.local/ECP), and make sure you have selected your organization as the management scope. Browse to Roles & Auditing, and open the properties for the Organization Management role group. Click on the Add button to add a new role to the Organization Management role group, select the Mailbox Import Export role, and click Add and OK to add it to the role group.

    Once you have assigned that role to your account you can open the Exchange Management Shell and execute the following command:

        Get-Mailbox -ResultSize unlimited | Search-Mailbox -TargetMailbox "your.account" -TargetFolder searchanddelete -LogLevel full -LogOnly -SearchQuery "kind:contact AND [email protected]"

    This command will create a list of all mailboxes and any contacts that were found with an email address that contains [email protected]. This list is then posted in the mailbox you specified as your.account, in the folder searchanddelete. Now examine the report that was created and posted in the mailbox to see if it matches what you think it should match; my results looked like this: [screenshot omitted]. When you're confident that the search includes all references and no false positives, you can execute almost the same command, but this time with a delete action instead of the log-only option:

        Get-Mailbox -ResultSize unlimited | Search-Mailbox -TargetMailbox "your.account" -TargetFolder searchanddelete -LogLevel full -DeleteContent -SearchQuery "kind:contact AND [email protected]"

    Now most people would think this would remove the contact object from the suggested contacts, resulting in a removal from the autocomplete list. Sad but not true. To clean up the autocomplete list, start Outlook with the command:

        outlook /cleanautocompletecache

    This will result in an empty cache, which is then rebuilt from the Suggested Contacts, and those no longer include the [email protected] contact.

  • Setting up page goals in Analytics when using progressive enhancement to load content using jquery .load

    - by sam
    I'm using jQuery .load to pull content from other pages into my homepage. So that Google can still see what's going on, I've made the <a> tags point to the real pages but override them in the JavaScript, so instead of going to that page it just loads that page's content into the main page. Normally I would just make the page /contact.html a goal. Can I still get it to work as a goal if the content is being loaded in? Can I do something like: when the user clicks <a href="contact.html" id="load-contact">contact</a>, it logs the clicking of the <a> tag as a goal, rather than the actual page being visited?

  • Hear about Oracle Supply Chain at Pella, Apr 27-29 '10

    - by [email protected]
    Oracle Customer Showcase - Apr 27-29 '10, featuring Pella Corp.: Delivering Greater Customer Value - "Discovering the Lean Value Chain". Pella is once again hosting Oracle customers at a mega-reference event in Pella, Iowa, on April 27-29. The agenda features a cross-stack set of topics and issues, including strategies for delivering customer value, improving the customer experience, Value Chain Planning / Manufacturing / Enterprise Performance Management, and Lean practices. Several executives will keynote, including Pella CIO Steve Printz. The event includes a demo grounds, round-table discussion groups, plant tours, and networking opportunities.

  • Email censorship system

    - by user1116589
    I would like to ask about censorship / moderation systems. The basic workflow of events:
    1. Customer sends an email to [email protected] from [email protected]
    2. The ACME administrator receives a notification and can moderate the email
    3. After moderation the administrator confirms the email and it is sent to [email protected]
    4. John answers to [email protected]
    5. Before the email is sent, it is moderated again by the ACME administrator
    What is important is that this functionality is easy to build with some CMS/CMF systems. The problem is that we do not want to use an extra domain or force the customer to log in to an extra system. The customer should only use his own mailbox or desktop email application. Thank you, Tomek

  • Oracle Enterprise Data Quality Adds Global Address Verification Capabilities for Greater Accuracy and Broader Location Coverage

    - by Mala Narasimharajan
    Data quality has many flavors. Product, customer: you name the data domain and there's data quality associated with it. Address verification is a little different, in that there is a tremendous amount of variation as well as nuance attached to it. Specifically, what makes address verification challenging is that more often than not, addresses are incomplete, riddled with misspellings, assigned incorrect postal codes, or cluttered with non-address items. Almost all data has locations, and accurate locations power a wealth of business processes: Customer Relationship Management, data quality, delivery of materials, goods or services, fraud detection, insurance risk assessment, data analytics, store and territory planning, and much more.

    Oracle Address Verification Server provides location-based services as well as deeper parsing and analysis capabilities for Oracle Enterprise Data Quality. Pre-integrated with the EDQ platform, Oracle Address Verification Server provides robust parsing and validation, as well as specialized location information, for over 240 countries – all populated countries on Earth.

    Oracle Enterprise Data Quality (EDQ) is a data quality platform dedicated to the distinct challenges of customer and product data quality. It performs advanced data profiling to identify and measure poor-quality data and identify rule requirements, as well as semantic and pattern-based recognition to accurately parse and standardize data that is poorly structured. EDQ is integrated with Oracle Master Data Management, including Oracle Customer Hub and Oracle Product Hub, as well as Oracle Data Integrator Enterprise Edition and Oracle CRM. Address Verification Server provides key address verification services for Oracle CRM and Oracle Customer Hub. In addition, Address Verification Server provides greater accuracy when handling address data due to its expanded sources and extensible knowledge repository, solid parsing across locales and countries, and adept handling of extraneous data in address fields.

    For more information on Oracle Address Verification Server visit: http://bit.ly/GMUE4H and http://bit.ly/GWf7U6

  • Why is testing MVC Views frowned upon?

    - by Peter Bernier
    I'm currently setting the groundwork for an ASP.Net MVC application and I'm looking into what sort of unit tests I should be prepared to write. I've seen in multiple places people essentially saying "don't bother testing your views; there's no logic, it's trivial, and it will be covered by an integration test." I don't understand how this has become the accepted wisdom. Integration tests serve an entirely different purpose than unit tests. If I break something, I don't want to know a half-hour later when my integration tests break; I want to know immediately.

    Sample scenario: let's say we're dealing with a standard CRUD app with a Customer entity. The customer has a name and an address. At each level of testing, I want to verify that the Customer retrieval logic gets both the name and the address properly. To unit-test the repository, I write an integration test to hit the database. To unit-test the business rules, I mock out the repository, feed the business rules appropriate data, and verify my expected results are returned.

    What I'd like to do: to unit-test the UI, I mock out the business rules, set up my expected customer instance, render the view, and verify that the view contains the appropriate values for the instance I specified.

    What I'm stuck doing: to unit-test the repository, I write an integration test, set up an appropriate login, create the required data in the database, open a browser, navigate to the customer, and verify the resulting page contains the appropriate values for the instance I specified.

    I realize that there is overlap between the two scenarios discussed above, but the key difference is the time and effort required to set up and execute the tests. If I (or another dev) removes the address field from the view, I don't want to wait for the integration test to discover this. I want it discovered and flagged in a unit test that gets run multiple times daily. I get the feeling that I'm just not grasping some key concept. Can someone explain why wanting immediate test feedback on the validity of an MVC view is a bad thing? (Or if not bad, then not the expected way to get said feedback?)

  • How to avoid email reply from my web site being marked as spam? [closed]

    - by Eric
    Possible Duplicate: How could I prevent my mail from being recognized as spam?

    Here's the situation:
    1. Customer fills out an inquiry form on the web site
    2. That inquiry goes to person X
    3. Person X goes to my web site (mysite.com), presses some keys, and the customer gets an email from [email protected]
    And here's my question: how can I be sure the email from [email protected] always gets through to the customer? Can I help it along by using SPF or some other secure email framework/solution? Thank you-- E
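    (For context on the SPF idea in the question: SPF works by publishing a DNS TXT record on mysite.com that lists the servers allowed to send mail as @mysite.com, for example v=spf1 mx ip4:203.0.113.10 -all, where the IP is a placeholder for the actual sending server. Pairing this with DKIM signing and a valid reverse DNS entry is the usual baseline for keeping automated replies out of spam folders.)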

  • Distinct Count of Customers in a SCD Type 2 in #DAX

    - by Marco Russo (SQLBI)
    If you have a Slowly Changing Dimension (SCD) Type 2 for your customers and you want to calculate the number of distinct customers that bought a product, you cannot use the simple formula:

        Customers := DISTINCTCOUNT ( FactTable[Customer Id] )

    because it would return the number of distinct versions of customers. What you really want is the number of distinct application keys of the customers, which could be a lower number than the one you get with the previous formula. Assuming that a Customer Code column in the Customers dimension contains the application key, you should use the following DAX formula:

        Customers := COUNTROWS ( SUMMARIZE ( FactTable, Customers[Customer Code] ) )

    Be careful: only the version above is really fast, because it is solved by the xVelocity (formerly known as VertiPaq) engine. Other formulas involving nested calculations might be more complex and move computation to the formula engine, resulting in a slower query. This is absolutely an interesting pattern and I have to say it's a killer feature. Try to do the same in Multidimensional…
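    For readers who think in SQL, the second measure computes the relational equivalent of the query below; the sketch assumes [Customer Id] is the surrogate key linking the fact table to the SCD2 dimension (column names taken from the formulas above):

        -- Distinct business keys, not distinct SCD2 versions (schema assumed)
        SELECT COUNT(DISTINCT c.[Customer Code]) AS Customers
        FROM FactTable AS f
        JOIN Customers AS c
            ON c.[Customer Id] = f.[Customer Id];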

    Read the article

  • What to "CRM" in San Francisco? CRM Highlights for OpenWorld '12

    - by Tony Berk
    There is plenty to SEE for CRM during OpenWorld in San Francisco, September 30 - October 4! As I mentioned in my earlier post about some of the keynote sessions, Is There a Cloud Over OpenWorld?, I'm going to try to highlight some key sessions to help you find the best ones for you. If you're interested in finding out where Oracle CRM products are headed, find your "roadmap" session. Here are some of the sessions in the CRM Track that you might want to consider attending for products you currently own or might consider for the future. I think you'll agree there is quite a bit of investment going on across Oracle CRM. Please use the OpenWorld Schedule Builder or check the OpenWorld Content Catalog for all of the session details and any time or location changes. Tip: pre-enrolled session registrants via Schedule Builder are allowed into the session rooms before anyone else, so Schedule Builder will guarantee you a seat. Many of the sessions below will likely be at capacity.

    General Session: Oracle Fusion CRM—Improving Sales Effectiveness, Efficiency, and Ease of Use (Session ID: GEN9674) - Oct 2, 11:45 AM - 12:45 PM. Anthony Lye, Senior VP, Oracle, leads this general session focused on Oracle Fusion CRM. Oracle Fusion CRM optimizes territories, combines quota management and incentive compensation, integrates sales and marketing, and cleanses and enriches data—all within a single application platform. Oracle Fusion can be configured, changed, and extended at runtime by end users, business managers, IT, and developers. Oracle Fusion CRM can be used from the Web, from a smartphone, from Microsoft Outlook, or from an iPad. Deloitte, sponsor of the CRM Track, will also present key concepts on CRM implementations.

    Oracle Fusion Customer Relationship Management: Overview/Strategy/Customer Experiences/Roadmap (CON9407) - Oct 1, 3:15 PM - 4:15 PM. In this session, learn how Oracle Fusion CRM enables companies to create better sales plans, generate more quality leads, and achieve higher win rates, and find out why customers are adopting Oracle Fusion CRM. Gain a deeper understanding of the unique capabilities only Oracle Fusion CRM provides, and learn how Oracle's commitment to CRM innovation is driving a wide range of future enhancements.

    Oracle RightNow CX Cloud Service Vision and Roadmap (CON9764) - Oct 1, 10:45 AM - 11:45 AM. Oracle RightNow CX Cloud Service combines Web, social, and contact center experiences for a unified, cross-channel service solution in the cloud, enabling organizations to increase sales and adoption, build trust, strengthen relationships, and reduce costs and effort. Come to this session to hear from Oracle experts about where the product is going and how Oracle is committed to accelerating the pace of innovation and value to its customers.

    Siebel CRM Overview, Strategy, and Roadmap (CON9700) - Oct 1, 12:15 PM - 1:15 PM. The world's most complete CRM solution, Oracle's Siebel CRM helps organizations differentiate their businesses. Come to this session to learn about the Siebel product roadmap and how Oracle is committed to accelerating the pace of innovation and value for its customers on this platform. Additionally, the session covers how Siebel customers can leverage many Oracle assets such as Oracle WebCenter Sites; InQuira, RightNow, and ATG/Endeca applications; and Oracle Policy Automation in conjunction with their current Siebel investments.

    Oracle Fusion Social CRM Strategy and Roadmap: Future of Collaboration and Social Engagement (CON9750) - Oct 4, 11:15 AM - 12:15 PM. Social is changing the customer experience! Come find out how Oracle can help you know your customers better, encourage brand affinity, and improve collaboration within your ecosystem. This session reviews Oracle's social media solution and shows how you can discover hidden insights buried in your enterprise and social data. Also learn how Oracle Social Network revolutionizes how enterprise users work, collaborate, and share to achieve successful outcomes.

    Oracle CRM On Demand Strategy and Roadmap (CON9727) - Oct 1, 10:45 AM - 11:45 AM. Oracle CRM On Demand is a powerful cloud-based customer relationship management solution. Come to this session to learn directly from Oracle experts about future product plans and hear how Oracle is committed to accelerating the pace of innovation and value to its customers.

    Knowledge Management Roadmap and Strategy (CON9776) - Oct 1, 12:15 PM - 1:15 PM. Learn how to harness the knowledge created as a natural byproduct of day-to-day interactions to lower costs and improve customer experience by delivering the right answer at the right time across channels. This session includes an overview of Oracle's product roadmap and vision for knowledge management for both the Oracle RightNow and Oracle Knowledge (formerly InQuira) product families.

    Oracle Policy Automation Roadmap: Supercharging the Customer Experience (CON9655) - Oct 1, 12:15 PM - 1:15 PM. Oracle Policy Automation delivers rapid customer value by streamlining the capture, analysis, and deployment of policies across every facet of the customer experience. This session discusses recent Oracle Policy Automation enhancements for policy analytics; the latest Oracle Policy Automation Connector for Siebel; and planned new capabilities, including availability with the Oracle RightNow product line.

    There is much more, so stay tuned for more highlights or check out the Content Catalog and search for your areas of interest. Which session are you most interested in? Make your suggestions! But no voting for Pearl Jam or Kings of Leon. Those are after hours!

    Read the article

  • What You Can Learn from the NFL Referee Lockout

    - by Christina McKeon
    American football is a lot like religion. The fans are devoted followers who take brand loyalty to a whole new level. These fans, who worship their teams each week, have shown that they are powerful customers whose voice has an impact. Yesterday, these fans proved that their opinion could force the hand of a large and powerful institution.

    With a three-month NFL referee lockout that seemed nowhere close to resolution, the Green Bay Packers and the Seattle Seahawks competed last Monday night. For those of you who might have been out of the news cycle the past few days, Green Bay lost the game due to a controversial call that many experts and analysts agree should have resulted in Green Bay winning the game. Outrage ensued. The NFL had pulled replacement referees from the high school ranks, and these replacements did not have the knowledge and experience to handle high-intensity NFL games. Fans protested about their customer experience. Their anger-filled rants were heard in social media, in the headlines of newspapers, on radio, and on national TV. Suddenly, the NFL was moved to reach an agreement with the referees. That agreement was reached late Wednesday night, with many believing that the referees had the upper hand, forcing the owners into submission. Some might argue that the referees benefited, not the fans. Since the fans wanted qualified and competent referees, I would say the fans did benefit. The referees are scheduled to return to the field this Sunday, so the fans got what they wanted.

    What can you learn from this negative customer experience?

    Customers are in control. NFL owners thought they were controlling this situation with the upper hand over the referees. The owners figured out they weren't in control when their fans reacted negatively. Customers can make or break you more now than ever before, which is why it is more important to connect with them, engage them in a personal manner, and create rewarding relationships.

    Protect your brand. Whether knowingly or unknowingly, the NFL put its brand and each team's brand at risk with replacement referees. Think about each business decision you make and how it may impact your brand at different points in time. A decision that results in a gain today could result in a larger loss down the road.

    Customer experience matters. The NFL likely foresaw declining revenues in ticket sales, merchandising, advertising, and other areas if the lockout continued. While fans primarily spoke with their minds in the days following the Green Bay debacle, their wallets would be the next things to speak. Customer experience directly affects your success and is one of the few areas where you can differentiate your business.

    What would you do if your brand got such negative attention? Would you be prepared to navigate such stormy waters? Would you be able to prevent such a fiasco? If you don't have a good answer to these questions, consider joining us October 3-5, 2012 at the Oracle Customer Experience Summit in San Francisco. You'll have the opportunity to learn even more about customer experience from industry experts such as best-selling author Seth Godin, Paul Hagen and Kerry Bodine from Forrester Research, Inc., George Kembel from the Stanford d.School, Bruce Temkin of The Temkin Group, and Gene Alvarez from Gartner Inc. There will also be plenty of your peers and customer experience experts available for networking and discussions.

    Read the article

  • how to update child records when updating the Master table using Linq [closed]

    - by user20358
    I currently use a general repository class that can update only a single table, like so:

    public abstract class MyRepository<T> : IRepository<T> where T : class
    {
        protected IObjectSet<T> _objectSet;
        protected ObjectContext _context;

        public MyRepository(ObjectContext Context)
        {
            _objectSet = Context.CreateObjectSet<T>();
            _context = Context;
        }

        public IQueryable<T> GetAll()
        {
            return _objectSet.AsQueryable();
        }

        public IQueryable<T> Find(Expression<Func<T, bool>> filter)
        {
            return _objectSet.Where(filter);
        }

        public void Add(T entity)
        {
            _objectSet.AddObject(entity);
            _context.ObjectStateManager.ChangeObjectState(entity, System.Data.EntityState.Added);
            _context.SaveChanges();
        }

        public void Update(T entity)
        {
            _context.ObjectStateManager.ChangeObjectState(entity, System.Data.EntityState.Modified);
            _context.SaveChanges();
        }

        public void Delete(T entity)
        {
            _objectSet.Attach(entity);
            _context.ObjectStateManager.ChangeObjectState(entity, System.Data.EntityState.Deleted);
            _objectSet.DeleteObject(entity);
            _context.SaveChanges();
        }
    }

    For every table class generated by my EDMX designer I create another class like this:

    public class CustomerRepo : MyRepository<Customer>
    {
        public CustomerRepo(ObjectContext context) : base(context) { }
    }

    For any updates that I need to make to a particular table I do this:

    Customer CustomerObj = new Customer();
    CustomerObj.Prop1 = ...
    CustomerObj.Prop2 = ...
    CustomerObj.Prop3 = ...
    CustomerRepo.Update(CustomerObj);

    This works perfectly well when I am updating just the specific table called Customer. Now suppose I also need to update each row of another table called Orders, which is a child of Customer. What changes do I need to make to the MyRepository class? The Orders table will have multiple records for a Customer record, and multiple fields too, say Field1, Field2, Field3. So my questions are:

    1. If I only need to update Field1 of the Orders table for some rows based on a condition, and Field2 for some other rows based on a different condition, what changes do I need to make?
    2. If there is no such condition and all child rows need to be updated with the same value, what changes do I need to make?

    Thanks for taking the time. Look forward to your inputs...
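
    One way to approach child updates (a sketch only: the Orders navigation property, the CustomerId key, and the field values here are assumptions layered on the question's entities) is to load the parent together with its children and let the ObjectContext change tracker pick up the modifications, rather than routing children through the generic Update method:

    public void UpdateCustomerOrders(ObjectContext context, int customerId)
    {
        // Load the customer and its orders so the context tracks every
        // child entity we are about to modify.
        var customer = context.CreateObjectSet<Customer>()
                              .Include("Orders")
                              .First(c => c.CustomerId == customerId);

        foreach (var order in customer.Orders)
        {
            if (order.Field3 > 100)          // condition A: touch Field1 only
                order.Field1 = "reviewed";
            else if (order.Field3 == 0)      // condition B: touch Field2 only
                order.Field2 = "pending";
        }

        // One SaveChanges persists all tracked modifications; no explicit
        // ChangeObjectState call is needed for entities the context loaded.
        context.SaveChanges();
    }

    For the second question (every child row gets the same value), drop the conditions and assign inside the loop; the key point is that entities materialized by the context are change-tracked, so parent and children can be saved in a single unit of work.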

    Read the article

  • CVE-2011-3256 Denial of Service (DoS) vulnerability in FreeType 2

    - by chandan
    CVE-2011-3256: Denial of Service (DoS) vulnerability
    CVSSv2 Base Score: 4.3
    Component: FreeType 2 Library

    Product and Resolution:
    Solaris 11 - Contact Support
    Solaris 10 - SPARC: 119812-13, X86: 119813-15
    Solaris 9 - Contact Support
    Solaris 8 - Contact Support

    This notification describes vulnerabilities fixed in third-party components that are included in Sun's product distribution. Information about vulnerabilities affecting Oracle Sun products can be found on the Oracle Critical Patch Updates and Security Alerts page.
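
    For reference, checking for and applying the Solaris 10 patch might look roughly like this (a sketch; the patch README documents the supported procedure, including any required reboot):

    # Is the FreeType patch already installed? (SPARC patch ID shown)
    showrev -p | grep 119812

    # Apply it if absent; use 119813-15 on x86 systems
    patchadd 119812-13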

    Read the article

  • Track anchor links with Google Analytics

    - by Fredrik
    I have searched for how to track anchor links in Analytics, but couldn't get it working. I have this code in the header:

    <script>
    (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
    (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
    m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
    })(window,document,'script','//www.google-analytics.com/analytics.js','ga');
    ga('_setAllowAnchor', true);
    ga('create', 'UA-*******-1', '****.com');
    ga('send', 'pageview');
    </script>

    And my links look like this:

    <a href='#/contact'><span>Contact</span></a>

    I also tried to use these links:

    <a href='#/contact' onClick="_gaq.push(['_trackPageview', location.pathname+location.search+location.hash]);"><span>Contact</span></a>

    Are there any tips on what I can do?
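
    One probable cause worth noting: _setAllowAnchor and _gaq belong to the older ga.js API, while the snippet above loads analytics.js, which does not understand the first and never defines the second. With analytics.js, a sketch along these lines reports each hash navigation as its own pageview:

    <script>
    // send a virtual pageview whenever the URL hash changes
    window.addEventListener('hashchange', function () {
      ga('send', 'pageview', location.pathname + location.search + location.hash);
    });
    </script>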

    Read the article

< Previous Page | 46 47 48 49 50 51 52 53 54 55 56 57  | Next Page >