Search Results

Search found 9598 results on 384 pages for 'product rec'.


  • Should I install an AV product on my domain controllers?

    - by mhud
    Should I run a server-specific antivirus, a regular antivirus, or no antivirus at all on my servers, particularly my Domain Controllers? Here's some background on why I'm asking: I had never questioned that antivirus software should be running on all Windows machines, period. Lately, though, I've had some obscure Active Directory-related issues that I have tracked down to antivirus software running on our domain controllers. The specific issue was that Symantec Endpoint Protection was running on all domain controllers. Occasionally, our Exchange server triggered a false positive in Symantec's "Network Threat Protection" on each DC in sequence. Once it had exhausted access to all DCs, Exchange began refusing requests, presumably because it could not communicate with any Global Catalog servers or perform any authentication. Outages would last about ten minutes at a time and would occur once every few days. It took a long time to isolate the problem because it was not easily reproducible, and the investigation was generally done after the issue had resolved itself.

    Read the article

  • Excel 2007 - combination of If and vlookup formula

    - by Neo
    I have a cell that refers to more than one worksheet and should display the result (value) when it finds the product in either of the two sheets. SheetA has two columns, where column A is the value and column B is the product name; SheetB only has product names. Below is my formula, but it fails to display the product value; instead it always displays "Not Found", even though the product is present in the sheets. Is there anything wrong with the formula? =IFERROR(VLOOKUP(A35,'SheetA'!A:B,1,FALSE),IFERROR(VLOOKUP(A35,'SheetB'!D19:D115,1,FALSE),"Not Found"))

    Read the article

  • Store product data in session variable or access db every time?

    - by JakeTheSnake
    My database stores lots of information about products (year, name, release_date, volume, etc.). I currently load all of the products once and store them in a session variable - right now there are only 8 products, but the list will grow over time. The reason I did this was to (perhaps foolishly) save disk reads every time the products page was accessed. Am I shooting myself in the foot by storing this information in the session?
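
    For what it's worth, a minimal sketch of an alternative to per-session storage, assuming this is an ASP.NET site (the platform isn't stated above, and ProductCache, GetProducts and the cache key are invented names): keep one application-wide cached copy of the product list instead of one copy per session, so memory use stays flat as traffic grows while the database is still read only occasionally.

        using System;
        using System.Collections.Generic;
        using System.Web;
        using System.Web.Caching;

        public static class ProductCache
        {
            private const string CacheKey = "AllProducts";

            // Returns the shared product list, hitting the database only when the
            // cache entry is missing or has expired. All sessions share one copy.
            public static List<T> GetProducts<T>(Func<List<T>> loadFromDb)
            {
                var cached = HttpRuntime.Cache[CacheKey] as List<T>;
                if (cached != null)
                    return cached;

                List<T> products = loadFromDb();

                // Refresh from the database at most every 10 minutes (or remove the
                // entry explicitly whenever a product is added or edited).
                HttpRuntime.Cache.Insert(
                    CacheKey,
                    products,
                    null,
                    DateTime.Now.AddMinutes(10),
                    Cache.NoSlidingExpiration);

                return products;
            }
        }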

    Read the article

  • Please help me correct the small bugs in this image editor

    - by Alex
    Hi, I'm working on a website that will sell hand made jewelry and I'm finishing the image editor, but it's not behaving quite right. Basically, the user uploads an image which will be saved as a source and then it will be resized to fit the user's screen and saved as a temp. The user will then go to a screen that will allow them to crop the image and then save it to it's final versions. All of that works fine, except, the final versions have 3 bugs. First is some black horizontal line on the very bottom of the image. Second is an outline of sorts that follows the edges. I thought it was because I was reducing the quality, but even at 100% it still shows up... And lastly, I've noticed that the cropped image is always a couple of pixels lower than what I'm specifying... Anyway, I'm hoping someone whose got experience in editing images with C# can maybe take a look at the code and see where I might be going off the right path. Oh, by the way, this in an ASP.NET MVC application. Here's the code: using System; using System.Drawing; using System.Drawing.Drawing2D; using System.Drawing.Imaging; using System.IO; using System.Linq; using System.Web; namespace Website.Models.Providers { public class ImageProvider { private readonly ProductProvider ProductProvider = null; private readonly EncoderParameters HighQualityEncoder = new EncoderParameters(); private readonly ImageCodecInfo JpegCodecInfo = ImageCodecInfo.GetImageEncoders().Single( c => (c.MimeType == "image/jpeg")); private readonly string Path = HttpContext.Current.Server.MapPath("~/Resources/Images/Products"); private readonly short[][] Dimensions = new short[3][] { new short[2] { 640, 480 }, new short[2] { 240, 0 }, new short[2] { 80, 60 } }; ////////////////////////////////////////////////////////// // Constructor ////////////////////////////////////////// ////////////////////////////////////////////////////////// public ImageProvider( ProductProvider ProductProvider) { this.ProductProvider = ProductProvider; HighQualityEncoder.Param[0] = new EncoderParameter(Encoder.Quality, 100L); } ////////////////////////////////////////////////////////// // Crop ////////////////////////////////////////////// ////////////////////////////////////////////////////////// public void Crop( string FileName, Image Image, Crop Crop) { using (Bitmap Source = new Bitmap(Image)) { using (Bitmap Target = new Bitmap(Crop.Width, Crop.Height)) { using (Graphics Graphics = Graphics.FromImage(Target)) { Graphics.InterpolationMode = InterpolationMode.HighQualityBicubic; Graphics.SmoothingMode = SmoothingMode.HighQuality; Graphics.PixelOffsetMode = PixelOffsetMode.HighQuality; Graphics.CompositingQuality = CompositingQuality.HighQuality; Graphics.DrawImage(Source, new Rectangle(0, 0, Target.Width, Target.Height), new Rectangle(Crop.Left, Crop.Top, Crop.Width, Crop.Height), GraphicsUnit.Pixel); }; Target.Save(FileName, JpegCodecInfo, HighQualityEncoder); }; }; } ////////////////////////////////////////////////////////// // Crop & Resize ////////////////////////////////////// ////////////////////////////////////////////////////////// public void CropAndResize( Product Product, Crop Crop) { using (Image Source = Image.FromFile(String.Format("{0}/{1}.source", Path, Product.ProductId))) { using (Image Temp = Image.FromFile(String.Format("{0}/{1}.temp", Path, Product.ProductId))) { float Percent = ((float)Source.Width / (float)Temp.Width); short Width = (short)(Temp.Width * Percent); short Height = (short)(Temp.Height * Percent); Crop.Height = (short)(Crop.Height * 
Percent); Crop.Left = (short)(Crop.Left * Percent); Crop.Top = (short)(Crop.Top * Percent); Crop.Width = (short)(Crop.Width * Percent); Img Img = new Img(); this.ProductProvider.AddImageAndSave(Product, Img); this.Crop(String.Format("{0}/{1}.cropped", Path, Img.ImageId), Source, Crop); using (Image Cropped = Image.FromFile(String.Format("{0}/{1}.cropped", Path, Img.ImageId))) { this.Resize(this.Dimensions[0], String.Format("{0}/{1}-L.jpg", Path, Img.ImageId), Cropped, HighQualityEncoder); this.Resize(this.Dimensions[1], String.Format("{0}/{1}-T.jpg", Path, Img.ImageId), Cropped, HighQualityEncoder); this.Resize(this.Dimensions[2], String.Format("{0}/{1}-S.jpg", Path, Img.ImageId), Cropped, HighQualityEncoder); }; }; }; this.Purge(Product); } ////////////////////////////////////////////////////////// // Queue ////////////////////////////////////////////// ////////////////////////////////////////////////////////// public void QueueFor( Product Product, HttpPostedFileBase PostedFile) { using (Image Image = Image.FromStream(PostedFile.InputStream)) { this.Resize(new short[2] { 1152, 0 }, String.Format("{0}/{1}.temp", Path, Product.ProductId), Image, HighQualityEncoder); }; PostedFile.SaveAs(String.Format("{0}/{1}.source", Path, Product.ProductId)); } ////////////////////////////////////////////////////////// // Purge ////////////////////////////////////////////// ////////////////////////////////////////////////////////// private void Purge( Product Product) { string Source = String.Format("{0}/{1}.source", Path, Product.ProductId); string Temp = String.Format("{0}/{1}.temp", Path, Product.ProductId); if (File.Exists(Source)) { File.Delete(Source); }; if (File.Exists(Temp)) { File.Delete(Temp); }; foreach (Img Img in Product.Imgs) { string Cropped = String.Format("{0}/{1}.cropped", Path, Img.ImageId); if (File.Exists(Cropped)) { File.Delete(Cropped); }; }; } ////////////////////////////////////////////////////////// // Resize ////////////////////////////////////////////// ////////////////////////////////////////////////////////// public void Resize( short[] Dimensions, string FileName, Image Image, EncoderParameters EncoderParameters) { if (Dimensions[1] == 0) { Dimensions[1] = (short)(Image.Height / ((float)Image.Width / (float)Dimensions[0])); }; using (Bitmap Bitmap = new Bitmap(Dimensions[0], Dimensions[1])) { using (Graphics Graphics = Graphics.FromImage(Bitmap)) { Graphics.InterpolationMode = InterpolationMode.HighQualityBicubic; Graphics.SmoothingMode = SmoothingMode.HighQuality; Graphics.PixelOffsetMode = PixelOffsetMode.HighQuality; Graphics.CompositingQuality = CompositingQuality.HighQuality; Graphics.DrawImage(Image, 0, 0, Dimensions[0], Dimensions[1]); }; Bitmap.Save(FileName, JpegCodecInfo, EncoderParameters); }; } } } Here's one of the images this produces:
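
    One hedged lead on the outline artifact described above (offered as a sketch, not a confirmed diagnosis): GDI+'s DrawImage with bicubic interpolation samples slightly outside the source rectangle at the image borders, which often shows up as a faint edge on resized output. The usual mitigation is to pass an ImageAttributes configured with WrapMode.TileFlipXY, roughly as follows (parameter names here are generic; this would stand in for the inner DrawImage call in the Resize method).

        using System.Drawing;
        using System.Drawing.Drawing2D;
        using System.Drawing.Imaging;

        public static class ResizeHelper
        {
            // Drop-in alternative for the DrawImage call inside Resize.
            // TileFlipXY mirrors the image at its borders, so the bicubic filter
            // never blends in black/transparent pixels from outside the bitmap,
            // which is a frequent cause of faint edge lines on resized JPEGs.
            public static void DrawWithoutEdgeArtifacts(
                Graphics graphics, Image source, int targetWidth, int targetHeight)
            {
                using (ImageAttributes attributes = new ImageAttributes())
                {
                    attributes.SetWrapMode(WrapMode.TileFlipXY);
                    graphics.DrawImage(
                        source,
                        new Rectangle(0, 0, targetWidth, targetHeight), // destination
                        0, 0, source.Width, source.Height,              // full source
                        GraphicsUnit.Pixel,
                        attributes);
                }
            }
        }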

    Read the article

  • Is jQuery forcing Adobe ColdFusion to abandon the dead flash product line?

    - by crosenblum
    I have been reading a lot about how Flash development/design has died, and as jQuery (and, in the near future, HTML5) takes over, will this start to push Adobe away from Flash and towards less product linking? I mean, I love ColdFusion and want it to continue to grow; however, if Adobe only bought ColdFusion from Macromedia so they could bundle Flash and ColdFusion together, does the death of Flash mean the death of ColdFusion? http://topnews.us/content/221385-jobs-says-adobes-flash-waning-and-had-its-day http://aext.net/2010/03/javascript-jquery-killing-flash-tutorial-jquery-plugin/ I really don't mind if Flash dies; I do mind greatly if ColdFusion does. Is the success of Flash linked to ColdFusion? If so, why or why not? The purpose of this isn't to start a war about Flash's pros and cons. I was only worried that Adobe would cause problems for ColdFusion if Flash ran into market/financial problems. That was my main concern, and no, I am not anti-Flash. But my financial sanity depends on ColdFusion being a success, which is why I asked - I want everyone else's opinion on this situation. Thank you.

    Read the article

  • Large Product catalog with statistics - alternatives to Sql Server?

    - by Eric P
    I am building the UI for a large product catalog (millions of products). I am using SQL Server, FreeText search and ASP.NET MVC. Tables are normalized and indexed, and most queries take less than a second to return. The issue is this: let's say the user searches by keyword. On the search results page I need to display/query for: the first 20 matching products (paged, sorted); the total count of matching products for paging; and the lists of stores, brands and colors found among the matching products. Each query takes about 0.5 to 1 seconds, so altogether it is around 5 seconds, and I would like the whole page to load in under 1 second. There are several approaches: (1) Optimize the queries even more - I have already spent a lot of time on this, so I'm not sure it can be pushed further. (2) Load the products first, then load the rest of the information using AJAX - more of a workaround, and it would need a UI revision. (3) Re-organize the data to be more report-friendly - I have already aggregated a lot of fields. I checked out several similar sites, for example zappos.com. Not only do they display the same information I would like in under 1 second, but they also include statistics (number of results in each category). The following is the search for keyword "white": http://www.zappos.com/white How do sites like Zappos and Amazon make their results, filters and stats appear almost instantly?
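
    As an illustration of the timing arithmetic above (five independent ~1-second queries adding up to ~5 seconds), here is a hedged C# sketch - SearchService and the Query* helpers are placeholders, not the poster's code - of running the five queries concurrently so the page waits roughly as long as the slowest query rather than the sum of all five. It does not explain how Zappos or Amazon do it, but it is one incremental option alongside the three approaches listed.

        using System.Collections.Generic;
        using System.Threading.Tasks;

        public class SearchPageData
        {
            public IList<string> Products { get; set; }   // first 20 results, simplified to names
            public int TotalCount { get; set; }
            public IList<string> Stores { get; set; }
            public IList<string> Brands { get; set; }
            public IList<string> Colors { get; set; }
        }

        public class SearchService
        {
            public SearchPageData Search(string keyword, int page)
            {
                // The five existing queries are independent, so run them concurrently;
                // total wall-clock time becomes roughly the slowest single query.
                var products = Task.Run(() => QueryProducts(keyword, page));
                var count    = Task.Run(() => QueryTotalCount(keyword));
                var stores   = Task.Run(() => QueryStores(keyword));
                var brands   = Task.Run(() => QueryBrands(keyword));
                var colors   = Task.Run(() => QueryColors(keyword));

                Task.WaitAll(products, count, stores, brands, colors);

                return new SearchPageData
                {
                    Products = products.Result,
                    TotalCount = count.Result,
                    Stores = stores.Result,
                    Brands = brands.Result,
                    Colors = colors.Result
                };
            }

            // Placeholders standing in for the existing SQL/FreeText queries;
            // each would open its own connection in a real implementation.
            private IList<string> QueryProducts(string keyword, int page) { return new List<string>(); }
            private int QueryTotalCount(string keyword) { return 0; }
            private IList<string> QueryStores(string keyword) { return new List<string>(); }
            private IList<string> QueryBrands(string keyword) { return new List<string>(); }
            private IList<string> QueryColors(string keyword) { return new List<string>(); }
        }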

    Read the article

  • What would be a good Database strategy to manage these two product options?

    - by bemused
    I have a site that allows users to purchase "items" (think of an advertisement, or a download). There are two ways to purchase: either a subscription of 70 items within one month (use them or lose them - at the end of the month your count is 0), or purchasing each item individually as you need it. So the user could subscribe and get 70/month, or pay for 10 and use them whenever they want until the 10 are gone. Maybe it's the late hour, but I can't isolate a solution I like, and I thought some users here would surely have stumbled upon something similar. One analogy I can imagine is web hosts: they sell hosting for monthly fees and also sell counts of things, like "you get 5 free domains with our reseller account". Or something like a movie download site, where you can subscribe and get 100 movies each month, or pay for a one-time package of 10 movies. So is this a web of tables, and what would be a good cross-reference between the product a user has purchased and how many they have left? Something like: products (productID, productType = subscription, consumable, or subscription & consumable); subscriptions (subscriptionID, subscriptionStartDate, subscriptionEndDate); consumables (consumableID, consumableName); UserProducts (userID, productID, productType, consumptionLimit, consumedCount) - if it's a subscription, check against the dates; otherwise just check that consumedCount is less than the limit. Usually I can lay out my data in a way that I know will work the way I expect, but this one feels a little questionable to me, like there is a hidden detail that is going to creep up later. That's why I decided to ask for help, in case someone out there can enlighten me with their wisdom and experience and clue me in to a satisfying strategy. Thank you.
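
    To make the check described above concrete, here is a small hedged sketch (the class and property names are invented for illustration, and the combined subscription & consumable case is left out): a subscription entitles the user while the current date falls inside its window and the count is below the limit; a one-off purchase only cares about the remaining count.

        using System;

        public enum ProductType { Subscription, Consumable }

        public class UserProduct
        {
            public ProductType Type { get; set; }
            public DateTime? SubscriptionStart { get; set; }  // only set for subscriptions
            public DateTime? SubscriptionEnd { get; set; }    // only set for subscriptions
            public int ConsumptionLimit { get; set; }          // e.g. 70 per month, or a one-off 10
            public int ConsumedCount { get; set; }

            // Can the user consume one more item right now?
            public bool CanConsume(DateTime now)
            {
                if (ConsumedCount >= ConsumptionLimit)
                    return false;

                if (Type == ProductType.Subscription)
                {
                    // Subscriptions are also bounded by their date window;
                    // unused items are lost when the window closes.
                    return SubscriptionStart.HasValue
                        && SubscriptionEnd.HasValue
                        && now >= SubscriptionStart.Value
                        && now <= SubscriptionEnd.Value;
                }

                // One-off purchases only care about the remaining count.
                return true;
            }
        }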

    Read the article

  • Partial view is not rendering within the main view it's contained in (instead it's rendered on its own page)?

    - by JaJ
    I have a partial view that is contained in a simple index view. When I try to add a new object to my model and update my partial view to display that new object along with existing objects the partial view is rendered outside the page that it's contained? I'm using AJAX to update the partial view but what is wrong with the following code? Model: public class Product { public int ID { get; set; } public string Name { get; set; } [DataType(DataType.Currency)] public decimal Price { get; set; } } public class BoringStoreContext { List<Product> results = new List<Product>(); public BoringStoreContext() { Products = new List<Product>(); Products.Add(new Product() { ID = 1, Name = "Sure", Price = (decimal)(1.10) }); Products.Add(new Product() { ID = 2, Name = "Sure2", Price = (decimal)(2.10) }); } public List<Product> Products {get; set;} } public class ProductIndexViewModel { public Product NewProduct { get; set; } public IEnumerable<Product> Products { get; set; } } Index.cshtml View: @model AjaxPartialPageUpdates.Models.ProductIndexViewModel @using (Ajax.BeginForm("Index_AddItem", new AjaxOptions { UpdateTargetId = "productList" })) { <div> @Html.LabelFor(model => model.NewProduct.Name) @Html.EditorFor(model => model.NewProduct.Name) </div> <div> @Html.LabelFor(model => model.NewProduct.Price) @Html.EditorFor(model => model.NewProduct.Price) </div> <div> <input type="submit" value="Add Product" /> </div> } <div id='productList'> @{ Html.RenderPartial("ProductListControl", Model.Products); } </div> ProductListControl.cshtml Partial View @model IEnumerable<AjaxPartialPageUpdates.Models.Product> <table> <!-- Render the table headers. --> <tr> <th>Name</th> <th>Price</th> </tr> <!-- Render the name and price of each product. --> @foreach (var item in Model) { <tr> <td>@Html.DisplayFor(model => item.Name)</td> <td>@Html.DisplayFor(model => item.Price)</td> </tr> } </table> Controller: public class HomeController : Controller { public ActionResult Index() { BoringStoreContext db = new BoringStoreContext(); ProductIndexViewModel viewModel = new ProductIndexViewModel { NewProduct = new Product(), Products = db.Products }; return View(viewModel); } public ActionResult Index_AddItem(ProductIndexViewModel viewModel) { BoringStoreContext db = new BoringStoreContext(); db.Products.Add(viewModel.NewProduct); return PartialView("ProductListControl", db.Products); } }

    Read the article

  • WPF ListView as a DataGrid – Part 2

    - by psheriff
    In my last blog post I showed you how to create GridViewColumn objects on the fly from the meta-data in a DataTable. By doing this you can create columns for a ListView at runtime instead of having to pre-define each ListView for each different DataTable. Well, many of us use collections of our classes and it would be nice to be able to do the same thing for our collection classes as well. This blog post will show you one approach for using collection classes as the source of the data for your ListView.  Figure 1: A List of Data using a ListView Load Property NamesYou could use reflection to gather the property names in your class, however there are two things wrong with this approach. First, reflection is too slow, and second you may not want to display all your properties from your class in the ListView. Instead of reflection you could just create your own custom collection class of PropertyHeader objects. Each PropertyHeader object will contain a property name and a header text value at a minimum. You could add a width property if you wanted as well. All you need to do is to create a collection of property header objects where each object represents one column in your ListView. Below is a simple example: PropertyHeaders coll = new PropertyHeaders(); coll.Add(new PropertyHeader("ProductId", "Product ID"));coll.Add(new PropertyHeader("ProductName", "Product Name"));coll.Add(new PropertyHeader("Price", "Price")); Once you have this collection created, you could pass this collection to a method that would create the GridViewColumn objects based on the information in this collection. Below is the full code for the PropertyHeader class. Besides the PropertyName and Header properties, there is a constructor that will allow you to set both properties when the object is created. C#public class PropertyHeader{  public PropertyHeader()  {  }   public PropertyHeader(string propertyName, string headerText)  {    PropertyName = propertyName;    HeaderText = headerText;  }   public string PropertyName { get; set; }  public string HeaderText { get; set; }} VB.NETPublic Class PropertyHeader  Public Sub New()  End Sub   Public Sub New(ByVal propName As String, ByVal header As String)    PropertyName = propName    HeaderText = header  End Sub   Private mPropertyName As String  Private mHeaderText As String   Public Property PropertyName() As String    Get      Return mPropertyName    End Get    Set(ByVal value As String)      mPropertyName = value    End Set  End Property   Public Property HeaderText() As String    Get      Return mHeaderText    End Get    Set(ByVal value As String)      mHeaderText = value    End Set  End PropertyEnd Class You can use a Generic List class to create a collection of PropertyHeader objects as shown in the following code. C#public class PropertyHeaders : List<PropertyHeader>{} VB.NETPublic Class PropertyHeaders  Inherits List(Of PropertyHeader)End Class Create Property Header Objects You need to create a method somewhere that will create and return a collection of PropertyHeader objects that will represent the columns you wish to add to your ListView prior to binding your collection class to that ListView. Below is a sample method called GetProperties that builds a list of PropertyHeader objects with properties and headers for a Product object. 
C#public PropertyHeaders GetProperties(){  PropertyHeaders coll = new PropertyHeaders();   coll.Add(new PropertyHeader("ProductId", "Product ID"));  coll.Add(new PropertyHeader("ProductName", "Product Name"));  coll.Add(new PropertyHeader("Price", "Price"));   return coll;} VB.NETPublic Function GetProperties() As PropertyHeaders  Dim coll As New PropertyHeaders()   coll.Add(New PropertyHeader("ProductId", "Product ID"))  coll.Add(New PropertyHeader("ProductName", "Product Name"))  coll.Add(New PropertyHeader("Price", "Price"))   Return collEnd Function WPFListViewCommon Class Now that you have a collection of PropertyHeader objects you need a method that will create a GridView and a collection of GridViewColumn objects based on this PropertyHeader collection. Below is a static/Shared method that you might put into a class called WPFListViewCommon. C#public static GridView CreateGridViewColumns(  PropertyHeaders properties){  GridView gv;  GridViewColumn gvc;   // Create the GridView  gv = new GridView();  gv.AllowsColumnReorder = true;   // Create the GridView Columns  foreach (PropertyHeader item in properties)  {    gvc = new GridViewColumn();    gvc.DisplayMemberBinding = new Binding(item.PropertyName);    gvc.Header = item.HeaderText;    gvc.Width = Double.NaN;    gv.Columns.Add(gvc);  }   return gv;} VB.NETPublic Shared Function CreateGridViewColumns( _    ByVal properties As PropertyHeaders) As GridView  Dim gv As GridView  Dim gvc As GridViewColumn   ' Create the GridView  gv = New GridView()  gv.AllowsColumnReorder = True   ' Create the GridView Columns  For Each item As PropertyHeader In properties    gvc = New GridViewColumn()    gvc.DisplayMemberBinding = New Binding(item.PropertyName)    gvc.Header = item.HeaderText    gvc.Width = [Double].NaN    gv.Columns.Add(gvc)  Next   Return gvEnd Function Build the Product Screen To build the window shown in Figure 1, you might write code like the following: C#private void CollectionSample(){  Product prod = new Product();   // Setup the GridView Columns  lstData.View = WPFListViewCommon.CreateGridViewColumns(       prod.GetProperties());  lstData.DataContext = prod.GetProducts();} VB.NETPrivate Sub CollectionSample()  Dim prod As New Product()   ' Setup the GridView Columns  lstData.View = WPFListViewCommon.CreateGridViewColumns( _       prod.GetProperties())  lstData.DataContext = prod.GetProducts()End Sub The Product class contains a method called GetProperties that returns a PropertyHeaders collection. You pass this collection to the WPFListViewCommon’s CreateGridViewColumns method and it will create a GridView for the ListView. When you then feed the DataContext property of the ListView the Product collection the appropriate columns have already been created and data bound. Summary In this blog you learned how to create a ListView that acts like a DataGrid using a collection class. While it does take a little code to do this, it is an alternative to creating each GridViewColumn in XAML. This gives you a lot of flexibility. You could even read in the property names and header text from an XML file for a truly configurable ListView. NOTE: You can download the complete sample code (in both VB and C#) at my website. http://www.pdsa.com/downloads. Choose Tips & Tricks, then "WPF ListView as a DataGrid – Part 2" from the drop-down. Good Luck with your Coding,Paul Sheriff ** SPECIAL OFFER FOR MY BLOG READERS **Visit http://www.pdsa.com/Event/Blog for a free eBook on "Fundamentals of N-Tier".  

    Read the article

  • Combined Likelihood Models

    - by Lukas Vermeer
    In a series of posts on this blog we have already described a flexible approach to recording events, a technique to create analytical models for reporting, a method that uses the same principles to generate extremely powerful facet based predictions and a waterfall strategy that can be used to blend multiple (possibly facet based) models for increased accuracy. This latest, and also last, addition to this sequence of increasing modeling complexity will illustrate an advanced approach to amalgamate models, taking us to a whole new level of predictive modeling and analytical insights; combination models predicting likelihoods using multiple child models. The method described here is far from trivial. We therefore would not recommend you apply these techniques in an initial implementation of Oracle Real-Time Decisions. In most cases, basic RTD models or the approaches described before will provide more than enough predictive accuracy and analytical insight. The following is intended as an example of how more advanced models could be constructed if implementation results warrant the increased implementation and design effort. Keep implemented statistics simple! Combining likelihoods Because facet based predictions are based on metadata attributes of the choices selected, it is possible to generate such predictions for more than one attribute of a choice. We can predict the likelihood of acceptance for a particular product based on the product category (e.g. ‘toys’), as well as based on the color of the product (e.g. ‘pink’). Of course, these two predictions may be completely different (the customer may well prefer toys, but dislike pink products) and we will have to somehow combine these two separate predictions to determine an overall likelihood of acceptance for the choice. Perhaps the simplest way to combine multiple predicted likelihoods into one is to calculate the average (or perhaps maximum or minimum) likelihood. However, this would completely forgo the fact that some facets may have a far more pronounced effect on the overall likelihood than others (e.g. customers may consider the product category more important than its color). We could opt for calculating some sort of weighted average, but this would require us to specify up front the relative importance of the different facets involved. This approach would also be unresponsive to changing consumer behavior in these preferences (e.g. product price bracket may become more important to consumers as a result of economic shifts). Preferably, we would want Oracle Real-Time Decisions to learn, act upon and tell us about, the correlations between the different facet models and the overall likelihood of acceptance. This additional level of predictive modeling, where a single supermodel (no pun intended) combines the output of several (facet based) models into a single prediction, is what we call a combined likelihood model. Facet Based Scores As an example, we have implemented three different facet based models (as described earlier) in a simple RTD inline service. These models will allow us to generate predictions for likelihood of acceptance for each product based on three different metadata fields: Category, Price Bracket and Product Color. We will use an Analytical Scores entity to store these different scores so we can easily pass them between different functions. 
A simple function, creatively named Compute Analytical Scores, will compute for each choice the different facet scores and return an Analytical Scores entity that is stored on the choice itself. For each score, a choice attribute referring to this entity is also added to be returned to the client to facilitate testing. One Offer To Predict Them All In order to combine the different facet based predictions into one single likelihood for each product, we will need a supermodel which can predict the likelihood of acceptance, based on the outcomes of the facet models. This model will not need to consider any of the attributes of the session, because they are already represented in the outcomes of the underlying facet models. For the same reason, the supermodel will not need to learn separately for each product, because the specific combination of facets for this product are also already represented in the output of the underlying models. In other words, instead of learning how session attributes influence acceptance of a particular product, we will learn how the outcomes of facet based models for a particular product influence acceptance at a higher level. We will therefore be using a single All Offers choice to represent all offers in our combined likelihood predictions. This choice has no attribute values configured, no scores and not a single eligibility rule; nor is it ever intended to be returned to a client. The All Offers choice is to be used exclusively by the Combined Likelihood Acceptance model to predict the likelihood of acceptance for all choices; based solely on the output of the facet based models defined earlier. The Switcheroo In Oracle Real-Time Decisions, models can only learn based on attributes stored on the session. Therefore, just before generating a combined prediction for a given choice, we will temporarily copy the facet based scores—stored on the choice earlier as an Analytical Scores entity—to the session. The code for the Predict Combined Likelihood Event function is outlined below. // set session attribute to contain facet based scores. // (this is the only input for the combined model) session().setAnalyticalScores(choice.getAnalyticalScores); // predict likelihood of acceptance for All Offers choice. CombinedLikelihoodChoice c = CombinedLikelihood.getChoice("AllOffers"); Double la = CombinedLikelihoodAcceptance.getChoiceEventLikelihoods(c, "Accepted"); // clear session attribute of facet based scores. session().setAnalyticalScores(null); // return likelihood. return la; This sleight of hand will allow the Combined Likelihood Acceptance model to predict the likelihood of acceptance for the All Offers choice using these choice specific scores. After the prediction is made, we will clear the Analytical Scores session attribute to ensure it does not pollute any of the other (facet) models. To guarantee our combined likelihood model will learn based on the facet based scores—and is not distracted by the other session attributes—we will configure the model to exclude any other inputs, save for the instance of the Analytical Scores session attribute, on the model attributes tab. Recording Events In order for the combined likelihood model to learn correctly, we must ensure that the Analytical Scores session attribute is set correctly at the moment RTD records any events related to a particular choice. We apply essentially the same switching technique as before in a Record Combined Likelihood Event function. 
// set session attribute to contain facet based scores // (this is the only input for the combined model). session().setAnalyticalScores(choice.getAnalyticalScores); // record input event against All Offers choice. CombinedLikelihood.getChoice("AllOffers").recordEvent(event); // force learn at this moment using the Internal Dock entry point. Application.getPredictor().learn(InternalLearn.modelArray, session(), session(), Application.currentTimeMillis()); // clear session attribute of facet based scores. session().setAnalyticalScores(null); In this example, Internal Learn is a special informant configured as the learn location for the combined likelihood model. The informant itself has no particular configuration and does nothing in itself; it is used only to force the model to learn at the exact instant we have set the Analytical Scores session attribute to the correct values. Reporting Results After running a few thousand (artificially skewed) simulated sessions on our ILS, the Decision Center reporting shows some interesting results. In this case, these results reflect perfectly the bias we ourselves had introduced in our tests. In practice, we would obviously use a wider range of customer attributes and expect to see some more unexpected outcomes. The facetted model for categories has clearly picked up on the that fact our simulated youngsters have little interest in purchasing the one red-hot vehicle our ILS had on offer. Also, it would seem that customer age is an excellent predictor for the acceptance of pink products. Looking at the key drivers for the All Offers choice we can see the relative importance of the different facets to the prediction of overall likelihood. The comparative importance of the category facet for overall prediction might, in part, be explained by the clear preference of younger customers for toys over other product types; as evident from the report on the predictiveness of customer age for offer category acceptance. Conclusion Oracle Real-Time Decisions' flexible decisioning framework allows for the construction of exceptionally elaborate prediction models that facilitate powerful targeting, but nonetheless provide insightful reporting. Although few customers will have a direct need for such a sophisticated solution architecture, it is encouraging to see that this lies within the realm of the possible with RTD; and this with limited configuration and customization required. There are obviously numerous other ways in which the predictive and reporting capabilities of Oracle Real-Time Decisions can be expanded upon to tailor to individual customers needs. We will not be able to elaborate on them all on this blog; and finding the right approach for any given problem is often more difficult than implementing the solution. Nevertheless, we hope that these last few posts have given you enough of an understanding of the power of the RTD framework and its models; so that you can take some of these ideas and improve upon your own strategy. As always, if you have any questions about the above—or any Oracle Real-Time Decisions design challenges you might face—please do not hesitate to contact us; via the comments below, social media or directly at Oracle. We are completely multi-channel and would be more than glad to help. :-)
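
    The trade-off discussed above - a plain average treats every facet as equally important, while a weighted combination lets some facets dominate - can be illustrated outside of RTD. The sketch below is ordinary C#, not Oracle Real-Time Decisions code, and the numbers are made up; the point of the combined likelihood model described in the article is that RTD learns the equivalent of these weights from recorded events rather than having them fixed by hand.

        using System;
        using System.Collections.Generic;
        using System.Linq;

        public static class LikelihoodCombiners
        {
            // Naive combination: every facet counts equally.
            public static double Average(IEnumerable<double> facetLikelihoods)
            {
                return facetLikelihoods.Average();
            }

            // Weighted combination: facets with larger weights dominate the result.
            public static double Weighted(IDictionary<string, double> facetLikelihoods,
                                          IDictionary<string, double> weights)
            {
                double weightedSum = 0.0, totalWeight = 0.0;
                foreach (var facet in facetLikelihoods)
                {
                    double w = weights[facet.Key];
                    weightedSum += w * facet.Value;
                    totalWeight += w;
                }
                return weightedSum / totalWeight;
            }
        }

        public static class Demo
        {
            public static void Main()
            {
                // Hypothetical facet-based likelihoods for one product.
                var scores = new Dictionary<string, double>
                {
                    { "Category", 0.80 },   // toys: strongly liked
                    { "Color",    0.10 },   // pink: strongly disliked
                    { "Price",    0.50 }
                };

                // Category matters far more than color for this audience.
                var weights = new Dictionary<string, double>
                {
                    { "Category", 3.0 },
                    { "Color",    0.5 },
                    { "Price",    1.0 }
                };

                Console.WriteLine(LikelihoodCombiners.Average(scores.Values));      // ~0.47
                Console.WriteLine(LikelihoodCombiners.Weighted(scores, weights));   // ~0.66
            }
        }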

    Read the article

  • Creating generic list of instances of a class.

    - by Jim Branson
    I have several projects where I build a dictionary from a small class. I'm using C# 2008, Visual studio 2008 and .net 3.5 This is the code: namespace ReportsTest { class Junk { public static Dictionary<string, string> getPlatKeys() { Dictionary<string, string> retPlatKeys = new Dictionary<string, string>(); SqlConnection conn = new SqlConnection("Data Source=JB55LTARL;Initial Catalog=HldiReports;Integrated Security=True"); SqlDataReader Dr = null; conn.Open(); SqlCommand cmnd = new SqlCommand("SELECT Make, Series, RedesignYear, SeriesName FROM CompPlatformKeys", conn); Dr = cmnd.ExecuteReader(); while (Dr.Read()) { utypPlatKeys rec = new utypPlatKeys(Dr); retPlatKeys.Add(rec.Make + rec.Series + rec.RedesignYear, rec.SeriesName); } conn = null; Dr = null; return retPlatKeys; } } public class utypPlatKeys { public string Make { get; set; } public string Series { get; set; } public string RedesignYear { get; set; } public string SeriesName { get; set; } public utypPlatKeys(SqlDataReader dr) { this.Make = dr.GetInt16(dr.GetOrdinal("Make")).ToString("D3"); this.Series = dr.GetInt16(dr.GetOrdinal("Series")).ToString("D3"); this.RedesignYear = dr.GetInt16(dr.GetOrdinal("RedesignYear")).ToString(); this.SeriesName = dr["SeriesName"].ToString(); } } } The immediate window shows all of the entries in retPlatKeys and if you hover over retPlatKeys after loading it indicates the number of elements like this: "retPlatKeys| Count = 923 which is correct. I went to create a new project using this pattern only now the immediate window says retPlatKeys is out of scope and hovering over retPlatKeys after loading I get something like retPlatKeys|0x0000000002578900. Any help is greatly appreciated.

    Read the article

  • Extracting specific nodes from XML using XML::Twig

    - by pratz
    i was trying to extract a particular set of nodes from the following XML structure using XML::Twig, but have been stuck ever since. I need to extract the 'player' nodes from the following structure and do a string match/replace on each of these node values. <pep:record> <agency> <subrecord type="scout"> <isnum>123XXX (print)</isnum> <isnum>234YYY (mag)</isnum> </subrecord> <subrecord type="group"> </subrecord> </agency </record> I tried using the following code, but I get pointed to a hash reference rather than actual string. my $parser = XML::Twig->new(twig_handlers => { isnum => sub { print $_->text."::" }, }); foreach my $rec (split(/::/, $parser->parse($my_xml))) { if ($rec =~ m/print/) { ($print = $rec) =~ s/( \(print\))//; } elsif($rec =~ m/mag/) { ($mag = $rec) =~ s/( \(mag\))//; } }

    Read the article

  • Combinations and Permutations in F#

    - by Noldorin
    I've recently written the following combinations and permutations functions for an F# project, but I'm quite aware they're far from optimised. /// Rotates a list by one place forward. let rotate lst = List.tail lst @ [List.head lst] /// Gets all rotations of a list. let getRotations lst = let rec getAll lst i = if i = 0 then [] else lst :: (getAll (rotate lst) (i - 1)) getAll lst (List.length lst) /// Gets all permutations (without repetition) of specified length from a list. let rec getPerms n lst = match n, lst with | 0, _ -> seq [[]] | _, [] -> seq [] | k, _ -> lst |> getRotations |> Seq.collect (fun r -> Seq.map ((@) [List.head r]) (getPerms (k - 1) (List.tail r))) /// Gets all permutations (with repetition) of specified length from a list. let rec getPermsWithRep n lst = match n, lst with | 0, _ -> seq [[]] | _, [] -> seq [] | k, _ -> lst |> Seq.collect (fun x -> Seq.map ((@) [x]) (getPermsWithRep (k - 1) lst)) // equivalent: | k, _ -> lst |> getRotations |> Seq.collect (fun r -> List.map ((@) [List.head r]) (getPermsWithRep (k - 1) r)) /// Gets all combinations (without repetition) of specified length from a list. let rec getCombs n lst = match n, lst with | 0, _ -> seq [[]] | _, [] -> seq [] | k, (x :: xs) -> Seq.append (Seq.map ((@) [x]) (getCombs (k - 1) xs)) (getCombs k xs) /// Gets all combinations (with repetition) of specified length from a list. let rec getCombsWithRep n lst = match n, lst with | 0, _ -> seq [[]] | _, [] -> seq [] | k, (x :: xs) -> Seq.append (Seq.map ((@) [x]) (getCombsWithRep (k - 1) lst)) (getCombsWithRep k xs) Does anyone have any suggestions for how these functions (algorithms) can be sped up? I'm particularly interested in how the permutation (with and without repetition) ones can be improved. The business involving rotations of lists doesn't look too efficient to me in retrospect. Update Here's my new implementation for the getPerms function, inspired by Tomas's answer. Unfortunately, it's not really any fast than the existing one. Suggestions? let getPerms n lst = let rec getPermsImpl acc n lst = seq { match n, lst with | k, x :: xs -> if k > 0 then for r in getRotations lst do yield! getPermsImpl (List.head r :: acc) (k - 1) (List.tail r) if k >= 0 then yield! getPermsImpl acc k [] | 0, [] -> yield acc | _, [] -> () } getPermsImpl List.empty n lst

    Read the article

  • What is wrong in this program using tie and an array?

    - by SCNCN2010
    File: #comment1 #comment2 #comment3 #START HERE a: [email protected] b: [email protected] My Perl program: use Data::Dumper; use Tie::File; tie my @array, 'Tie::File', 'ala.txt' or die $!; my $rec = 'p: [email protected]'; my $flag =1 ; my $add_flag = 0; for my $i (0..$#array) { next if ($array[$i] =~ /^\s*$/); if ( $flag == 1 ) { if ($array[$i] =~ /#START HERE/ ) { $flag = 0; } else { next ; } } if (($array[$i] cmp $rec) == 1) { splice @array, $i, 1, $rec; $add_flag = 1; last ; } } if ( $add_flag == 0 ) { my $index = $#array+1; $array[$index] = $rec ; } The record is always added at the end of the file. I am trying to insert it in the middle, at the beginning, or at the end so the entries stay in alphabetical order.

    Read the article

  • F# - Facebook Hacker Cup - Double Squares

    - by Jacob
    I'm working on strengthening my F#-fu and decided to tackle the Facebook Hacker Cup Double Squares problem. I'm having some problems with the run-time and was wondering if anyone could help me figure out why it is so much slower than my C# equivalent. There's a good description from another post; Source: Facebook Hacker Cup Qualification Round 2011 A double-square number is an integer X which can be expressed as the sum of two perfect squares. For example, 10 is a double-square because 10 = 3^2 + 1^2. Given X, how can we determine the number of ways in which it can be written as the sum of two squares? For example, 10 can only be written as 3^2 + 1^2 (we don't count 1^2 + 3^2 as being different). On the other hand, 25 can be written as 5^2 + 0^2 or as 4^2 + 3^2. You need to solve this problem for 0 = X = 2,147,483,647. Examples: 10 = 1 25 = 2 3 = 0 0 = 1 1 = 1 My basic strategy (which I'm open to critique on) is to; Create a dictionary (for memoize) of the input numbers initialzed to 0 Get the largest number (LN) and pass it to count/memo function Get the LN square root as int Calculate squares for all numbers 0 to LN and store in dict Sum squares for non repeat combinations of numbers from 0 to LN If sum is in memo dict, add 1 to memo Finally, output the counts of the original numbers. Here is the F# code (See code changes at bottom) I've written that I believe corresponds to this strategy (Runtime: ~8:10); open System open System.Collections.Generic open System.IO /// Get a sequence of values let rec range min max = seq { for num in [min .. max] do yield num } /// Get a sequence starting from 0 and going to max let rec zeroRange max = range 0 max /// Find the maximum number in a list with a starting accumulator (acc) let rec maxNum acc = function | [] -> acc | p::tail when p > acc -> maxNum p tail | p::tail -> maxNum acc tail /// A helper for finding max that sets the accumulator to 0 let rec findMax nums = maxNum 0 nums /// Build a collection of combinations; ie [1,2,3] = (1,1), (1,2), (1,3), (2,2), (2,3), (3,3) let rec combos range = seq { let count = ref 0 for inner in range do for outer in Seq.skip !count range do yield (inner, outer) count := !count + 1 } let rec squares nums = let dict = new Dictionary<int, int>() for s in nums do dict.[s] <- (s * s) dict /// Counts the number of possible double squares for a given number and keeps track of other counts that are provided in the memo dict. let rec countDoubleSquares (num: int) (memo: Dictionary<int, int>) = // The highest relevent square is the square root because it squared plus 0 squared is the top most possibility let maxSquare = System.Math.Sqrt((float)num) // Our relevant squares are 0 to the highest possible square; note the cast to int which shouldn't hurt. let relSquares = range 0 ((int)maxSquare) // calculate the squares up front; let calcSquares = squares relSquares // Build up our square combinations; ie [1,2,3] = (1,1), (1,2), (1,3), (2,2), (2,3), (3,3) for (sq1, sq2) in combos relSquares do let v = calcSquares.[sq1] + calcSquares.[sq2] // Memoize our relevant results if memo.ContainsKey(v) then memo.[v] <- memo.[v] + 1 // return our count for the num passed in memo.[num] // Read our numbers from file. 
//let lines = File.ReadAllLines("test2.txt") //let nums = [ for line in Seq.skip 1 lines -> Int32.Parse(line) ] // Optionally, read them from straight array let nums = [1740798996; 1257431873; 2147483643; 602519112; 858320077; 1048039120; 415485223; 874566596; 1022907856; 65; 421330820; 1041493518; 5; 1328649093; 1941554117; 4225; 2082925; 0; 1; 3] // Initialize our memoize dictionary let memo = new Dictionary<int, int>() for num in nums do memo.[num] <- 0 // Get the largest number in our set, all other numbers will be memoized along the way let maxN = findMax nums // Do the memoize let maxCount = countDoubleSquares maxN memo // Output our results. for num in nums do printfn "%i" memo.[num] // Have a little pause for when we debug let line = Console.Read() And here is my version in C# (Runtime: ~1:40: using System; using System.Collections.Generic; using System.Diagnostics; using System.IO; using System.Linq; using System.Text; namespace FBHack_DoubleSquares { public class TestInput { public int NumCases { get; set; } public List<int> Nums { get; set; } public TestInput() { Nums = new List<int>(); } public int MaxNum() { return Nums.Max(); } } class Program { static void Main(string[] args) { // Read input from file. //TestInput input = ReadTestInput("live.txt"); // As example, load straight. TestInput input = new TestInput { NumCases = 20, Nums = new List<int> { 1740798996, 1257431873, 2147483643, 602519112, 858320077, 1048039120, 415485223, 874566596, 1022907856, 65, 421330820, 1041493518, 5, 1328649093, 1941554117, 4225, 2082925, 0, 1, 3, } }; var maxNum = input.MaxNum(); Dictionary<int, int> memo = new Dictionary<int, int>(); foreach (var num in input.Nums) { if (!memo.ContainsKey(num)) memo.Add(num, 0); } DoMemoize(maxNum, memo); StringBuilder sb = new StringBuilder(); foreach (var num in input.Nums) { //Console.WriteLine(memo[num]); sb.AppendLine(memo[num].ToString()); } Console.Write(sb.ToString()); var blah = Console.Read(); //File.WriteAllText("out.txt", sb.ToString()); } private static int DoMemoize(int num, Dictionary<int, int> memo) { var highSquare = (int)Math.Floor(Math.Sqrt(num)); var squares = CreateSquareLookup(highSquare); var relSquares = squares.Keys.ToList(); Debug.WriteLine("Starting - " + num.ToString()); Debug.WriteLine("RelSquares.Count = {0}", relSquares.Count); int sum = 0; var index = 0; foreach (var square in relSquares) { foreach (var inner in relSquares.Skip(index)) { sum = squares[square] + squares[inner]; if (memo.ContainsKey(sum)) memo[sum]++; } index++; } if (memo.ContainsKey(num)) return memo[num]; return 0; } private static TestInput ReadTestInput(string fileName) { var lines = File.ReadAllLines(fileName); var input = new TestInput(); input.NumCases = int.Parse(lines[0]); foreach (var lin in lines.Skip(1)) { input.Nums.Add(int.Parse(lin)); } return input; } public static Dictionary<int, int> CreateSquareLookup(int maxNum) { var dict = new Dictionary<int, int>(); int square; foreach (var num in Enumerable.Range(0, maxNum)) { square = num * num; dict[num] = square; } return dict; } } } Thanks for taking a look. UPDATE Changing the combos function slightly will result in a pretty big performance boost (from 8 min to 3:45): /// Old and Busted... let rec combosOld range = seq { let rangeCache = Seq.cache range let count = ref 0 for inner in rangeCache do for outer in Seq.skip !count rangeCache do yield (inner, outer) count := !count + 1 } /// The New Hotness... let rec combos maxNum = seq { for i in 0..maxNum do for j in i..maxNum do yield i,j }

    Read the article

  • How can I create the XML::Simple data structure using a Perl XML SAX parser?

    - by DVK
    Summary: I am looking for a fast XML parser (most likely a wrapper around some standard SAX parser) which will produce per-record data structures 100% identical to those produced by XML::Simple. Details: We have a large code infrastructure which depends on processing records one by one and expects each record to be a data structure in the format produced by XML::Simple, since it has always used XML::Simple since the early Jurassic era. An example simple XML is: <root> <rec><f1>v1</f1><f2>v2</f2></rec> <rec><f1>v1b</f1><f2>v2b</f2></rec> <rec><f1>v1c</f1><f2>v2c</f2></rec> </root> And example rough code is: sub process_record { my ($obj, $record_hash) = @_; # do_stuff } my $records = XML::Simple->XMLin(@args)->{root}; foreach my $record (@$records) { $obj->process_record($record) }; As everyone knows, XML::Simple is, well, simple. And more importantly, it is very slow and a memory hog—being a DOM parser, it needs to build and store 100% of the data in memory. So it's not the best tool for parsing an XML file consisting of a large number of small records record by record. However, re-writing the entire code base (which consists of a large number of "process_record"-like methods) to work with a standard SAX parser seems like too big a task to be worth the resources, even at the cost of living with XML::Simple. I'm looking for an existing module, probably based on a SAX parser (or anything fast with a small memory footprint), which can be used to produce $record hashrefs one by one based on the XML pictured above, so they can be passed to $obj->process_record($record) and be 100% identical to what XML::Simple's hashrefs would have been. I don't care much what the interface of the new module is, e.g. whether I need to call next_record() or give it a callback coderef accepting a record.

    Read the article

  • Problem with Command Pattern under Visual Studio 2008 (C++)

    - by D.Giunchi
    Dear All, I've a problem with this pattern under c++ on VS 2008. The same code has been tested in gcc (linux, mac and mingw for widnows) and it works. I copy/paste the code here: class MyCommand { public: virtual void execute() = 0; virtual ~MyCommand () {}; }; class MyOperation { public: virtual void DoIt() {}; //I also write it not inline }; class MyOperationDerived : public MyOperation { public: virtual void DoIt() {}; //I also write it not inline }; class MyUndoStackCommand : public MyCommand { public: typedef void(MyOperation::*Action)(); MyUndoStackCommand(MyOperation *rec, Action action); /*virtual*/ void execute(); /*virtual*/ ~MyUndoStackCommand(); private: MyOperation *myReceiver; Action myAction ; }; in cpp: #include "MyUndoStackCommand.h" #include "MyOperation.h" MyUndoStackCommand::~MyUndoStackCommand() { } MyUndoStackCommand::MyUndoStackCommand(myOperation *rec, Action action): myReceiver(rec), myAction(action) { } void MyUndoStackCommand::execute() { ((myReceiver)->*(myAction))(); } use in main.cpp: MyReceiver receiver; MyUndoStackCommand usc(&receiver, &MyOperation::DoIt); usc.execute(); when I debug under visual studio only if I set inside MyUndoStackCommand, directly myAction = &MyOperation::DoIt , it works, otherwise not. Any advice? thank you very much, dan Edit: The following code compiles with g++ - changes by Neil Butterworth flagged as //NB. class MyCommand { public: virtual void execute() = 0; virtual ~MyCommand () {}; }; class MyOperation { public: virtual void DoIt() {}; //I also write it not inline }; class MyOperationDerived : public MyOperation { public: virtual void DoIt() {}; //I also write it not inline }; class MyUndoStackCommand : public MyCommand { public: typedef void(MyOperation::*Action)(); MyUndoStackCommand(MyOperation *rec, Action action); /*virtual*/ void execute(); /*virtual*/ ~MyUndoStackCommand(); private: MyOperation *myReceiver; Action myAction ; }; MyUndoStackCommand::~MyUndoStackCommand() { } MyUndoStackCommand::MyUndoStackCommand(MyOperation *rec, //NB Action action) : myReceiver(rec), myAction(action) { } void MyUndoStackCommand::execute() { ((myReceiver)->*(myAction))(); } int main() { MyOperation receiver; //NB MyUndoStackCommand usc(&receiver, &MyOperation::DoIt); usc.execute(); }

    Read the article

  • load data from grid row into (pop up) form for editing

    - by user1495457
    I read in Ext JS in Action ( by J. Garcia) that if we have an instance of Ext.data.Record, we can use the form's loadRecord method to set the form's values. However, he does not give a working example of this (in the example that he uses data is loaded into a form through a file called data.php). I have searched many forums and found the following entry helpful as it gave me an idea on how to solve my problem by using form's loadRecord method: load data from grid to form Now the code for my store and grid is as follows: var userstore = Ext.create('Ext.data.Store', { storeId: 'viewUsersStore', model: 'Configs', autoLoad: true, proxy: { type: 'ajax', url: '/user/getuserviewdata.castle', reader: { type: 'json', root: 'users' }, listeners: { exception: function (proxy, response, operation, eOpts) { Ext.MessageBox.alert("Error", "Session has timed-out. Please re-login and try again."); } } } }); var grid = Ext.create('Ext.grid.Panel', { id: 'viewUsersGrid', title: 'List of all users', store: Ext.data.StoreManager.lookup('viewUsersStore'), columns: [ { header: 'Username', dataIndex: 'username' }, { header: 'Full Name', dataIndex: 'fullName' }, { header: 'Company', dataIndex: 'companyName' }, { header: 'Latest Time Login', dataIndex: 'lastLogin' }, { header: 'Current Status', dataIndex: 'status' }, { header: 'Edit', menuDisabled: true, sortable: false, xtype: 'actioncolumn', width: 50, items: [{ icon: '../../../Content/ext/img/icons/fam/user_edit.png', tooltip: 'Edit user', handler: function (grid, rowIndex, colIndex) { var rec = userstore.getAt(rowIndex); alert("Edit " + rec.get('username')+ "?"); EditUser(rec.get('id')); } }] }, ] }); function EditUser(id) { //I think I need to have this code here - I don't think it's complete/correct though var formpanel = Ext.getCmp('CreateUserForm'); formpanel.getForm().loadRecord(rec); } 'CreateUserForm' is the ID of a form that already exists and which should appear when user clicks on Edit icon. That pop-up form should then automatically be populated with the correct data from the grid row. However my code is not working. I get an error at the line 'formpanel.getForm().loadRecord(rec)' - it says 'Microsoft JScript runtime error: 'undefined' is null or not an object'. Any tips on how to solve this?

    Read the article

  • hibernate: com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException

    - by user121196
    When saving an object to the database using Hibernate, it sometimes fails because certain fields of the object exceed the maximum varchar length defined in the database. Therefore I am using the following approach: 1. attempt to save; 2. if I get a DataException, truncate the fields in the object to the max length specified in the DB definition, then try to save again. However, on the second save, after the truncation, I'm getting the following exception: com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Cannot add or update a child row: a foreign key constraint fails. Here's the relevant code - what's wrong? public static void saveLenientObject(Object obj){ try { save2(rec); } catch (org.hibernate.exception.DataException e) { // TODO Auto-generated catch block e.printStackTrace(); saveLenientObject(rec, e); } catch (Exception e) { // TODO Auto-generated catch block e.printStackTrace(); } } private static void saveLenientObject(Object rec, DataException e) { Util.truncateObject(rec); System.out.println("after truncation "); save2(rec); } public static void save2(Object obj) throws Exception{ try{ beginTransaction(); getSession().save(obj); commitTransaction(); }catch(Exception e){ e.printStackTrace(); rollbackTransaction(); //closeSession(); throw e; }finally{ closeSession(); } }

    Read the article

  • mythbuntu 12 - lirc device doesn't appear to even exist

    - by FrustratedWithFormsDesigner
    I'm trying to get a new installation of Mythbuntu working. So far everything is OK except the remote. The sensor for the remote is on my Hauppauge WinTV HVR-1250. First I tried running irw to see what was being picked up by the sensor:
    $ irw
    connect: No such file or directory
    Trying to run lircd gives:
    $ lircd start
    lircd: can't open or create /var/run/lirc/lircd.pid
    I looked for any lirc devices and found none:
    $ ls /dev/li*
    ls: cannot access /dev/li*: No such file or directory
    Just to be sure, I checked /proc/bus/input/devices, which lists two power buttons (not sure why), the keyboard and mouse devices, and the audio devices. There is nothing for the IR receiver on the tuner card (which I thought was strange, because shouldn't the tuner show up here?):
    $ cat /proc/bus/input/devices
    I: Bus=0019 Vendor=0000 Product=0001 Version=0000
    N: Name="Power Button"
    P: Phys=PNP0C0C/button/input0
    S: Sysfs=/devices/LNXSYSTM:00/device:00/PNP0C0C:00/input/input0
    U: Uniq=
    H: Handlers=kbd event0
    B: PROP=0
    B: EV=3
    B: KEY=10000000000000 0

    I: Bus=0019 Vendor=0000 Product=0001 Version=0000
    N: Name="Power Button"
    P: Phys=LNXPWRBN/button/input0
    S: Sysfs=/devices/LNXSYSTM:00/LNXPWRBN:00/input/input1
    U: Uniq=
    H: Handlers=kbd event1
    B: PROP=0
    B: EV=3
    B: KEY=10000000000000 0

    I: Bus=0003 Vendor=099a Product=7202 Version=0111
    N: Name="Wireless Keyboard/Mouse"
    P: Phys=usb-0000:00:10.1-2/input0
    S: Sysfs=/devices/pci0000:00/0000:00:10.1/usb8/8-2/8-2:1.0/input/input2
    U: Uniq=
    H: Handlers=sysrq kbd event2
    B: PROP=0
    B: EV=120013
    B: KEY=1000000000007 ff9f207ac14057ff febeffdfffefffff fffffffffffffffe
    B: MSC=10
    B: LED=7

    I: Bus=0003 Vendor=099a Product=7202 Version=0111
    N: Name="Wireless Keyboard/Mouse"
    P: Phys=usb-0000:00:10.1-2/input1
    S: Sysfs=/devices/pci0000:00/0000:00:10.1/usb8/8-2/8-2:1.1/input/input3
    U: Uniq=
    H: Handlers=kbd mouse0 event3
    B: PROP=0
    B: EV=1f
    B: KEY=4837fff072ff32d bf54444600000000 70001 20c100b17c000 267bfad9415fed 9e168000004400 10000002
    B: REL=143
    B: ABS=100000000
    B: MSC=10

    I: Bus=0000 Vendor=0000 Product=0000 Version=0000
    N: Name="HD-Audio Generic Line"
    P: Phys=ALSA
    S: Sysfs=/devices/pci0000:00/0000:00:14.2/sound/card0/input4
    U: Uniq=
    H: Handlers=event4
    B: PROP=0
    B: EV=21
    B: SW=2000

    I: Bus=0000 Vendor=0000 Product=0000 Version=0000
    N: Name="HD-Audio Generic Front Mic"
    P: Phys=ALSA
    S: Sysfs=/devices/pci0000:00/0000:00:14.2/sound/card0/input5
    U: Uniq=
    H: Handlers=event5
    B: PROP=0
    B: EV=21
    B: SW=10

    I: Bus=0000 Vendor=0000 Product=0000 Version=0000
    N: Name="HD-Audio Generic Rear Mic"
    P: Phys=ALSA
    S: Sysfs=/devices/pci0000:00/0000:00:14.2/sound/card0/input6
    U: Uniq=
    H: Handlers=event6
    B: PROP=0
    B: EV=21
    B: SW=10

    I: Bus=0000 Vendor=0000 Product=0000 Version=0000
    N: Name="HD-Audio Generic Front Headphone"
    P: Phys=ALSA
    S: Sysfs=/devices/pci0000:00/0000:00:14.2/sound/card0/input7
    U: Uniq=
    H: Handlers=event7
    B: PROP=0
    B: EV=21
    B: SW=4

    I: Bus=0000 Vendor=0000 Product=0000 Version=0000
    N: Name="HD-Audio Generic Line-Out"
    P: Phys=ALSA
    S: Sysfs=/devices/pci0000:00/0000:00:14.2/sound/card0/input8
    U: Uniq=
    H: Handlers=event8
    B: PROP=0
    B: EV=21
    B: SW=40

    According to dmesg the lirc driver was registered, but it doesn't look like any device was associated with it:
    $ dmesg | grep irc
    [ 10.631162] lirc_dev: IR Remote Control driver registered, major 249
    So far I've seen a number of forum pages suggesting a trick that links /dev/lirc to some other device that is the REAL IR sensor, such as /dev/input/event5, but those cases assume that the real device shows up in /proc/bus/input/devices, and I don't see any such device there. Any suggestions on how to fix or further diagnose this?
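    A minimal diagnostic sketch for this situation, assuming the card's IR receiver is handled by the cx23885 driver through the kernel's rc-core layer (which is how recent kernels usually expose it) and that the v4l-utils package is installed for ir-keytable. The rc0 name, the event node number and the hardware.conf values below are illustrative assumptions, not verified settings for this machine:

    # 1. Check whether the kernel registered an IR receiver through rc-core
    ls /sys/class/rc/                  # expect something like rc0 if the tuner's IR part is bound
    cat /sys/class/rc/rc0/protocols    # rc0 is an assumed name; use whatever the ls above shows
    sudo ir-keytable                   # lists rc devices, their drivers and the input event node

    # 2. If an rc device exists, it maps to a /dev/input/eventN node that lirc can read
    #    with its devinput driver. In /etc/lirc/hardware.conf set, for example:
    #        REMOTE_DRIVER="devinput"
    #        REMOTE_DEVICE="/dev/input/event5"   # use the event node ir-keytable reported
    #    then restart lirc:
    sudo service lirc restart

    # 3. If /sys/class/rc is empty, the IR side of the tuner driver never initialised;
    #    look for its messages (or errors) in the kernel log:
    dmesg | grep -i cx23885

    If ir-keytable does show the receiver but irw still reports nothing, running ir-keytable -t and pressing remote buttons prints decoded scancodes, which helps separate a kernel-side problem from a lirc configuration problem.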

    Read the article

  • PARTNER WEBCAST- INNOVATIONS IN PRODUCTS PROGRAM (FORMERLY KNOWN AS COMPETENCE VIRTUAL)

    - by mseika
    PARTNER WEBCAST - INNOVATIONS IN PRODUCTS PROGRAM (FORMERLY KNOWN AS COMPETENCE VIRTUAL)
    JULY 2ND, 2012 AT 04:00 PM CET (03:00 PM GMT)
    I am pleased to invite you to join the Innovations in Products webcast. Innovations in Products presents the new functions and features of Oracle Applications products, including sales positioning. The key objective of these webcasts is to inspire System Integrators' implementation personnel to conduct successful after-sales work in their customer projects. Innovations in Products is presented on the first Monday of each quarter after the billable day (4:00 to 5:00 PM CET). The webcast is intended for System Integrators' Implementation Certified Specialists, but it is open to other interested Oracle Applications personnel at System Integrators as well. First, two Oracle representatives will discuss Oracle's contribution to partners. Then you will see the product breakout sessions, followed by Q&A with Oracle experts. Each session lasts a maximum of 1 hour, and a Q&A document covering all questions and answers will be made available after the webcast.
    What are the benefits for partners?
    Find out how Innovations in Products helps you to improve your after sales
    Discover new functions and features so you can enrich your customers' solutions
    Learn more about Oracle Applications products, especially sales positioning
    Hear crucial questions raised by colleagues like you and learn from their interests
    Engage and present your questions to subject-matter experts
    Be inspired by the richness of the Oracle Applications portfolio, for your and your customer's benefit
    Note: Should you already be familiar with a specific product, choose another one; doing so expands your knowledge of the overall Applications portfolio. Some presentations contain product demonstrations, although they are not intended to be extremely detailed technical presentations.
    Note: In the latter part of this email you will also find 17 links to the recent Applications product presentations and 6 links to the Public Sector Value Proposition presentations that were presented in the Innovations in Industries program.
    Product breakout sessions:
    Fusion Applications Technology and Extensibility
    Fusion Applications - Transforming your Back-Office Accounting Function
    Fusion HCM & Talent Overview & Extensibility
    Fusion HCM Compensation Planning
    Enterprise PLM for the Product Value Chain
    Oracle's Asset Management and Maintenance Solution
    For more details please visit Innovations in Products and the other breakout sessions on the OPN page.
    Delivery Format
    The Innovations in Products program is a series of FREE prerecorded Applications product presentations followed by Q&A, delivered over the Web. Participants can submit questions during the webcast via chat, and subject-matter experts provide verbal answers live. Innovations in Products consists of several parallel prerecorded product breakout sessions, each lasting a maximum of 1 hour. First, two Oracle representatives discuss Oracle's contribution to partners; then come the product breakout sessions, followed by Q&A with Oracle experts. A Q&A document covering all questions and answers will be made available after the webcast. You can also watch Innovations in Products afterwards, as its content will remain available online for the next 6-12 months.
    The next Innovations in Products webcasts will be presented as follows: July 2nd 2012, October 1st 2012, January 14th 2013, April 8th 2013.
    Note: Depending on local network bandwidth, please allow a few seconds for the presentations to download. You might want to refresh your screen by pressing F5.
    Duration: Maximum 1 hour.
    For further information please contact me, Markku Rouhiainen.

    Read the article

  • PARTNER WEBCAST: INNOVATIONS IN PRODUCTS - PROGRAM

    - by mseika
    PARTNER WEBCAST: INNOVATIONS IN PRODUCTS - PROGRAM
    OCTOBER 1ST, 2012 AT 04:00 PM CET (03:00 PM GMT)
    Dear partner,
    I am pleased to invite you to join the Innovations in Products webcast. Innovations in Products presents the new functions and features of Oracle products, including sales positioning. The key objective of these webcasts is to inspire System Integrators' implementation personnel to conduct successful after-sales work in their customer projects. Innovations in Products is presented on the first Monday of each quarter after the billable day (4:00 to 5:00 PM CET). The webcast is intended for System Integrators' Implementation Certified Specialists, but it is open to other System Integrator personnel as well. First, two Oracle representatives will discuss Oracle's contribution to partners. Then you will see the product breakout sessions, followed by Q&A with Oracle experts. Each session lasts a maximum of 1 hour, and a Q&A document covering all questions and answers will be made available after the webcast.
    What are the benefits for partners?
    Find out how Innovations in Products helps you to improve your after sales
    Discover new functions and features so you can enrich your customers' solutions
    Learn more about Oracle products, especially sales positioning
    Hear crucial questions raised by colleagues like you and learn from their interests
    Engage and present your questions to subject-matter experts
    Be inspired by the richness of Oracle's product portfolio, for your and your customer's benefit
    Note: Should you already be familiar with a specific product, choose another one; doing so expands your knowledge of the overall product portfolio. Some presentations contain product demonstrations, although they are not intended to be extremely detailed technical presentations.
    To access the 23 previously presented Applications product presentations and the 6 Public Sector Value Proposition presentations, please click here. You might want to bookmark the overall registration page Innovations in Products October 1st and the global event calendar page events.oracle.com.
    Delivery Format
    The Innovations in Products program is a series of FREE prerecorded Oracle product presentations followed by Q&A, delivered over the Web. Participants can submit questions during the webcast via chat, and subject-matter experts provide verbal answers live. Innovations in Products consists of several parallel prerecorded product breakout sessions, each lasting a maximum of 1 hour. First, two Oracle representatives discuss Oracle's contribution to partners; then come the product breakout sessions, followed by Q&A with Oracle experts. A Q&A document covering all questions and answers will be made available after the webcast. You can also watch Innovations in Products afterwards, as its content will remain available online for the next 6-12 months.
    The next Innovations in Products webcasts will be presented as follows: October 1st 2012, January 14th 2013, April 8th 2013.
    Note: Depending on local network bandwidth, please allow a few seconds for the presentations to download. You might want to refresh your screen by pressing F5.
    Duration: Maximum 1 hour.
    For further information please contact me, Markku Rouhiainen.
    Best regards,
    Markku Rouhiainen
    Director, Applications Partner Enablement EMEA

    Read the article

< Previous Page | 44 45 46 47 48 49 50 51 52 53 54 55  | Next Page >