Search Results

Search found 10878 results on 436 pages for 'changed'.

Page 89/436

  • Daisies and Rye Swaying in the Summer Wind Wallpaper

    - by Asian Angel
    Flowers [DesktopNexus]

    Read the article

  • New article available in "SOA Suite Essentials for WLI Users" series: Dynamic Data Lookup in a Business Process

    - by simone.geib
    It is my pleasure to announce the publishing of another article in our "SOA Suite Essentials for WLI Users" series: "Dynamic Data Lookup in a Business Process: Meta Data Cache Control in Oracle WebLogic Integration and Domain Value Maps in SOA Suite". This article explains how dynamic data can be retrieved in a business process using Domain Value Maps in SOA Suite and shows the similarities to the WLI XML MetaData Cache Control. Lots of customers have asked about this comparison and I hope they will find it useful. The article follows "Setting Web Service and JCA Adapter Endpoints Dynamically in Oracle SOA Suite" which describes how web services and JCA adapter endpoints in SOA Suite can be changed at run-time, and so completes the use case where a BPEL process writes to a file (via file adapter) and the output directory and the file name are set dynamically. Please let me know what you think about the series and this specific article.

    Read the article

  • Node.js Adventure - Host Node.js on Windows Azure Worker Role

    - by Shaun
    In my previous post I demonstrated how to develop and deploy a Node.js application on Windows Azure Web Sites (a.k.a. WAWS). WAWS is a new feature of the Windows Azure platform: it is low-cost, and it provides the IIS and IISNode components so that we can host a Node.js application through Git, FTP and WebMatrix without any configuration or component installation. But sometimes we need to use a Windows Azure Cloud Service (a.k.a. WACS) and host our Node.js application in a worker role. Below are some benefits of using a worker role:

    - WAWS leverages IIS and IISNode to host the Node.js application, which runs in x86 WOW mode. In some cases this reduces performance compared with x64.
    - A WACS worker role does not need IIS, so there are no IIS restrictions such as the 8000 concurrent request limit.
    - WACS gives developers more flexibility and control. For example, we can RDP into the virtual machines of our worker role instances.
    - WACS provides service configuration settings that can be changed while the role is running.
    - WACS offers more scaling capability than WAWS. In WAWS we can have at most 3 reserved instances per web site, while in WACS we can have up to 20 instances in a subscription.
    - Since with a WACS worker role we start Node.js ourselves in a process, we can control its input, output and error streams. We can also control the version of Node.js.

    Run Node.js in Worker Role

    Node.js can be started from nothing more than its executable. This means that in Windows Azure we can have a worker role containing "node.exe" and the Node.js source files, and start it in the Run method of the worker role entry class.

    Let's create a new Windows Azure project in Visual Studio and add a new worker role. Since the worker role has to execute "node.exe" with our application code, we need to add "node.exe" to the project. Right click on the worker role project and add an existing item. By default Node.js is installed in the "Program Files\nodejs" folder, so we can navigate there and add "node.exe".

    Then we need to create the Node.js entry code. In WAWS the entry file must be named "server.js", because the application is hosted by IIS and IISNode, and IISNode only accepts "server.js". But here, since we control everything, we can choose any file as the entry code. For example, I created a new JavaScript file named "index.js" in the project root. Since this is a C# Windows Azure project we cannot create a JavaScript file from the "Add new item" context menu; we have to create a text file and then rename it with the JavaScript extension.

    After adding these two files we should set their "Copy to Output Directory" property to "Copy Always" or "Copy if Newer", otherwise they will not be included in the package when deployed.

    Let's paste some very simple Node.js code into "index.js", as below. As you can see, I created a web server listening on port 12345.

        var http = require("http");
        var port = 12345;

        http.createServer(function (req, res) {
            res.writeHead(200, { "Content-Type": "text/plain" });
            res.end("Hello World\n");
        }).listen(port);

        console.log("Server running at port %d", port);

    Then we need to start "node.exe" with this file when the worker role starts. This can be done in its Run method: I locate "node.exe" and the entry JavaScript file, create a new process to run them, and have the worker role wait for the process to exit. If everything is OK, once the web server is up the process will sit there listening for incoming requests and should not terminate. The code in the worker role looks like this:

        public override void Run()
        {
            // This is a sample worker implementation. Replace with your logic.
            Trace.WriteLine("NodejsHost entry point called", "Information");

            // retrieve the node.exe and entry node.js source code file name.
            var node = Environment.ExpandEnvironmentVariables(@"%RoleRoot%\approot\node.exe");
            var js = "index.js";

            // prepare the process starting of node.exe
            var info = new ProcessStartInfo(node, js)
            {
                CreateNoWindow = false,
                ErrorDialog = true,
                WindowStyle = ProcessWindowStyle.Normal,
                UseShellExecute = false,
                WorkingDirectory = Environment.ExpandEnvironmentVariables(@"%RoleRoot%\approot")
            };
            Trace.WriteLine(string.Format("{0} {1}", node, js), "Information");

            // start the node.exe with entry code and wait for exit
            var process = Process.Start(info);
            process.WaitForExit();
        }

    Then we can run it locally. In the compute emulator UI the worker role starts and executes Node.js, and the Node.js window appears. Open the browser to verify the website hosted by our worker role.

    Next, let's deploy it to Azure, which needs some additional steps. First, we need to create an input endpoint. By default there is no endpoint defined in a worker role, so we open the role property window in Visual Studio and create a new input TCP endpoint on the port we want our website to use; in this case I will use 80. Even though we created a web server, we should add a TCP endpoint to the worker role, since Node.js always listens on TCP rather than HTTP. Then change "index.js" so that our web server listens on 80:

        var http = require("http");
        var port = 80;

        http.createServer(function (req, res) {
            res.writeHead(200, { "Content-Type": "text/plain" });
            res.end("Hello World\n");
        }).listen(port);

        console.log("Server running at port %d", port);

    Then publish it to Windows Azure, and in the browser we can see our Node.js website running on a WACS worker role.

    We may encounter an error if we try to run our Node.js website on port 80 in the local emulator. This is because the compute emulator has already registered port 80 and maps the 80 endpoint to 81, but our Node.js process cannot detect this, so when it tries to listen on 80 it fails because 80 is already in use.

    Use NPM Modules

    When we use WAWS to host Node.js, we can simply install the modules we need and then publish or upload all files to WAWS. But if we use a WACS worker role, we have some extra steps to make the modules work. Assume we plan to use "express" in our application. First of all we should download and install this module through the NPM command. After the install finishes, the files are on disk but not included in the worker role project; if we deploy the worker role right now the module will not be packaged and uploaded to Azure. Hence we need to add them to the project. In the Solution Explorer window click the "Show all files" button, select the "node_modules" folder and in the context menu select "Include In Project".

    But that is not enough. We also need to set all files in this module to "Copy always" or "Copy if newer", so that they are uploaded to Azure along with "node.exe" and "index.js". This is a painful step since there might be many files in a module. So I created a small tool which updates a C# project file and marks all of its items as "Copy always". The code is very simple:

        static void Main(string[] args)
        {
            if (args.Length < 1)
            {
                Console.WriteLine("Usage: copyallalways [project file]");
                return;
            }

            var proj = args[0];
            File.Copy(proj, string.Format("{0}.bak", proj));

            var xml = new XmlDocument();
            xml.Load(proj);
            var nsManager = new XmlNamespaceManager(xml.NameTable);
            nsManager.AddNamespace("pf", "http://schemas.microsoft.com/developer/msbuild/2003");

            // add the output setting to copy always
            var contentNodes = xml.SelectNodes("//pf:Project/pf:ItemGroup/pf:Content", nsManager);
            UpdateNodes(contentNodes, xml, nsManager);
            var noneNodes = xml.SelectNodes("//pf:Project/pf:ItemGroup/pf:None", nsManager);
            UpdateNodes(noneNodes, xml, nsManager);
            xml.Save(proj);

            // remove the namespace attributes
            var content = xml.InnerXml.Replace("<CopyToOutputDirectory xmlns=\"\">", "<CopyToOutputDirectory>");
            xml.LoadXml(content);
            xml.Save(proj);
        }

        static void UpdateNodes(XmlNodeList nodes, XmlDocument xml, XmlNamespaceManager nsManager)
        {
            foreach (XmlNode node in nodes)
            {
                var copyToOutputDirectoryNode = node.SelectSingleNode("pf:CopyToOutputDirectory", nsManager);
                if (copyToOutputDirectoryNode == null)
                {
                    var n = xml.CreateNode(XmlNodeType.Element, "CopyToOutputDirectory", null);
                    n.InnerText = "Always";
                    node.AppendChild(n);
                }
                else
                {
                    if (string.Compare(copyToOutputDirectoryNode.InnerText, "Always", true) != 0)
                    {
                        copyToOutputDirectoryNode.InnerText = "Always";
                    }
                }
            }
        }

    Please be careful when using this tool. I created it only for this demo, so do not use it directly in a production environment. Unload the worker role project, execute the tool with the worker role project file name as the command line argument and it will set all items to "Copy always"; then reload the worker role project.

    Now let's change "index.js" to use express:

        var express = require("express");
        var app = express();

        var port = 80;

        app.configure(function () {
        });

        app.get("/", function (req, res) {
            res.send("Hello Node.js!");
        });

        app.get("/User/:id", function (req, res) {
            var id = req.params.id;
            res.json({
                "id": id,
                "name": "user " + id,
                "company": "IGT"
            });
        });

        app.listen(port);

    Finally, let's publish it and have a look in the browser.

    Use Windows Azure SQL Database

    We can also use Windows Azure SQL Database (a.k.a. WASD) from Node.js when hosting in a worker role. Since we control the version of Node.js, we can now use the x64 version of "node-sqlserver". This is better than hosting Node.js on WAWS, which only supports x86. Just install the "node-sqlserver" module from NPM, copy "sqlserver.node" from the "Build\Release" folder to the "Lib" folder, include them in the worker role project and run my tool to set them to "Copy always". Finally, update "index.js" to use WASD:

        var express = require("express");
        var sql = require("node-sqlserver");

        var connectionString = "Driver={SQL Server Native Client 10.0};Server=tcp:{SERVER NAME}.database.windows.net,1433;Database={DATABASE NAME};Uid={LOGIN}@{SERVER NAME};Pwd={PASSWORD};Encrypt=yes;Connection Timeout=30;";
        var port = 80;

        var app = express();

        app.configure(function () {
            app.use(express.bodyParser());
        });

        app.get("/", function (req, res) {
            sql.open(connectionString, function (err, conn) {
                if (err) {
                    console.log(err);
                    res.send(500, "Cannot open connection.");
                }
                else {
                    conn.queryRaw("SELECT * FROM [Resource]", function (err, results) {
                        if (err) {
                            console.log(err);
                            res.send(500, "Cannot retrieve records.");
                        }
                        else {
                            res.json(results);
                        }
                    });
                }
            });
        });

        app.get("/text/:key/:culture", function (req, res) {
            sql.open(connectionString, function (err, conn) {
                if (err) {
                    console.log(err);
                    res.send(500, "Cannot open connection.");
                }
                else {
                    var key = req.params.key;
                    var culture = req.params.culture;
                    var command = "SELECT * FROM [Resource] WHERE [Key] = '" + key + "' AND [Culture] = '" + culture + "'";
                    conn.queryRaw(command, function (err, results) {
                        if (err) {
                            console.log(err);
                            res.send(500, "Cannot retrieve records.");
                        }
                        else {
                            res.json(results);
                        }
                    });
                }
            });
        });

        app.get("/sproc/:key/:culture", function (req, res) {
            sql.open(connectionString, function (err, conn) {
                if (err) {
                    console.log(err);
                    res.send(500, "Cannot open connection.");
                }
                else {
                    var key = req.params.key;
                    var culture = req.params.culture;
                    var command = "EXEC GetItem '" + key + "', '" + culture + "'";
                    conn.queryRaw(command, function (err, results) {
                        if (err) {
                            console.log(err);
                            res.send(500, "Cannot retrieve records.");
                        }
                        else {
                            res.json(results);
                        }
                    });
                }
            });
        });

        app.post("/new", function (req, res) {
            var key = req.body.key;
            var culture = req.body.culture;
            var val = req.body.val;

            sql.open(connectionString, function (err, conn) {
                if (err) {
                    console.log(err);
                    res.send(500, "Cannot open connection.");
                }
                else {
                    var command = "INSERT INTO [Resource] VALUES ('" + key + "', '" + culture + "', N'" + val + "')";
                    conn.queryRaw(command, function (err, results) {
                        if (err) {
                            console.log(err);
                            res.send(500, "Cannot retrieve records.");
                        }
                        else {
                            res.send(200, "Inserted Successful");
                        }
                    });
                }
            });
        });

        app.listen(port);

    Publish to Azure, and now we can see our Node.js application working with WASD through the x64 version of "node-sqlserver".

    Summary

    In this post I demonstrated how to host Node.js in a Windows Azure Cloud Service worker role. By using a worker role we can control the version of Node.js as well as the entry code, it is possible to do some preparation work before the Node.js application starts, and the IIS and IISNode limitations are removed. I personally recommend using a worker role for Node.js hosting. But there are some problems with the approach I described here. The first is that we need to set all JavaScript files and module files to "Copy always" or "Copy if newer" manually. The second is that, this way, we cannot retrieve the cloud service configuration information. For example, we defined the endpoint in the worker role properties, but we also hard-coded the listening port in Node.js. Ideally Node.js should retrieve the endpoint itself, but I can tell you that won't work with this approach. In the next post I will describe another way to execute "node.exe" and the Node.js application, so that we can get the cloud service configuration in Node.js. I will also demonstrate how to use Windows Azure Storage from Node.js by using the Windows Azure Node.js SDK.

    Hope this helps,
    Shaun

    All documents and related graphics and code are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

    Read the article

  • This web part does not have a valid XSLT stylesheet: There is no XSL property available for presenting the data.

    - by Patrick Olurotimi Ige
    I have been thinking for a while about how I can reuse my code when building custom DataView web parts in SharePoint Designer 2010. So I decided to use XslLink, which is one of the properties available when you edit a SharePoint web part. I started by creating an XSL file that I could reuse, but after adding the link to the file like so: <XslLink>sites/server/mycustomtemplate.xsl</XslLink> I get the error: "This web part does not have a valid XSLT stylesheet: There is no XSL property available for presenting the data." After some debugging I noticed that the relative directory path in the link to the XSL stylesheet gets broken. So I changed it to the full URL, <XslLink>http://mysite/sites/server/mycustomtemplate.xsl</XslLink>, and it works. Enjoy!

    Read the article

  • Updating the managed debugging API for .NET v4

    - by Brian Donahue
    In any successful investigation, the right tools play a big part in collecting evidence about the state of the "crime scene" as it was before the detectives arrived. Unfortunately for the Crash Scene Investigator, we don't have the budget to fly out to the customer's site, chalk the outline, and eat their doughnuts. We have to rely on the end-user to collect the evidence for us, which means giving them the fingerprint dust and the evidence baggies and leaving them to it. With that in mind, the Red Gate support team have been writing tools that can collect vital clues with a minimum of fuss. Years ago we would have asked for a memory dump, where we used to get the customer to run CDB.exe and produce dumps that we could analyze in-house, but those dumps were pretty unwieldy (500MB files) and the debugger often didn't dump exactly where we wanted, or made five or more dumps. What we wanted was just the minimum state information from the program at the time of failure, so we produced a managed debugger that captured every first and second-chance exception and logged the stack and a minimal amount of variables from the memory of the application, which could all be exported as XML. This caused less inconvenience to the end-user because it is much easier to send a 65KB XML file in an email than a 500MB file containing all of the application's memory. We don't need to have the entire victim shipped out to us when we just want to know what was under the fingernails.

    The thing that made creating a managed debugging tool possible was the MDbg Engine example written by Microsoft as part of the Debugging Tools for Windows distribution. Since the ICorDebug interface is a bit difficult to understand, they had kindly created some wrappers that provided an event-driven debugging model that was perfect for our needs, but .NET 4 applications under debugging started complaining that "The debugger's protocol is incompatible with the debuggee". The introduction of .NET Framework v4 had changed the managed debugging API significantly, however, without an update for the MDbg Engine code! After a few hours of research, I had finally worked out that most of the version 4 ICorDebug interface still works much the same way in "legacy" v2 mode and there was a relatively easy fix for the problem in that you can still get a reference to legacy ICorDebug by changing the way the interface is created. In .NET v2, the interface was acquired using the CreateDebuggingInterfaceFromVersion method in mscoree.dll. In v4, you must first create IClrMetaHost, enumerate the runtimes, get an ICLRRuntimeInfo interface to the .NET 4 runtime from that, and use the GetInterface method in mscoree.dll to return a "legacy" ICorDebug interface. The rest of the MDbg Engine will continue working the old way. Here is how I had changed the MDbg Engine code to support .NET v4:

        private void InitFromVersion(string debuggerVersion)
        {
            if (debuggerVersion.StartsWith("v1"))
            {
                throw new ArgumentException("Can't debug a version 1 CLR process (\"" + debuggerVersion + "\"). Run application in a version 2 CLR, or use a version 1 debugger instead.");
            }

            ICorDebug rawDebuggingAPI = null;
            if (debuggerVersion.StartsWith("v4"))
            {
                Guid CLSID_MetaHost = new Guid("9280188D-0E8E-4867-B30C-7FA83884E8DE");
                Guid IID_MetaHost = new Guid("D332DB9E-B9B3-4125-8207-A14884F53216");
                ICLRMetaHost metahost = (ICLRMetaHost)NativeMethods.ClrCreateInterface(CLSID_MetaHost, IID_MetaHost);
                IEnumUnknown runtimes = metahost.EnumerateInstalledRuntimes();
                ICLRRuntimeInfo runtime = GetRuntime(runtimes, debuggerVersion);

                // Defined in metahost.h
                Guid CLSID_CLRDebuggingLegacy = new Guid(0xDF8395B5, 0xA4BA, 0x450b, 0xA7, 0x7C, 0xA9, 0xA4, 0x77, 0x62, 0xC5, 0x20);
                Guid IID_ICorDebug = new Guid("3D6F5F61-7538-11D3-8D5B-00104B35E7EF");

                Object res;
                runtime.GetInterface(ref CLSID_CLRDebuggingLegacy, ref IID_ICorDebug, out res);
                rawDebuggingAPI = (ICorDebug)res;
            }
            else
                rawDebuggingAPI = NativeMethods.CreateDebuggingInterfaceFromVersion((int)CorDebuggerVersion.Whidbey, debuggerVersion);

            if (rawDebuggingAPI != null)
                InitFromICorDebug(rawDebuggingAPI);
            else
                throw new ArgumentException("Support for debugging version " + debuggerVersion + " is not yet implemented");
        }

    The changes above will ensure that the debugger can support .NET Framework v2 and v4 applications with the same codebase, but we do compile two different applications: one targeting v2 and the other v4. As a footnote I need to add that some missing native methods and wrappers, along with the EnumerateRuntimes method code, came from the Mindbg project on Codeplex. Another change is that when using the MDbgEngine.CreateProcess to launch a process in the debugger, do not supply a null as the final argument. This does not work any more because GetCORVersion always returns "v2.0.50727" as the function has been deprecated in .NET v4. What's worse is that on a system with only .NET 4, the user will be prompted to download and install .NET v2! Not nice! This works much better:

        proc = m_Debugger.CreateProcess(ProcessName, ProcessArgs, DebugModeFlag.Default,
            String.Format("v{0}.{1}.{2}",
                System.Environment.Version.Major,
                System.Environment.Version.Minor,
                System.Environment.Version.Build));

    Microsoft "unofficially" plan on updating the MDbg samples soon, but if you have an MDbg-based application, you can get it working right now by changing one method a bit and adding a few new interfaces (ICLRMetaHost, IEnumUnknown, and ICLRRuntimeInfo). The new, non-legacy implementation of MDbg Engine will add new, interesting features like dump-file support and by association I assume garbage-collection/managed object stats, so it will be well worth looking into if you want to extend the functionality of a managed debugger going forward.

    Read the article

  • My new DNS change works from America but not Sweden

    - by Dougie
    Several hours ago I changed the nameserver and DNS info on one of my domains at my domain registrar. When I access the domain from my home computers, and when my friends access it, we get the old IP address hosting the dead site (we all live in Sweden). But when I access the website from my mobile phone, through google.com/translate, or via North American proxies, the website is shown like it should be. Why is this? Does it take time for the change to take effect in different locations/countries? I find it very strange and would like to start using my site now. Do you think it will change, or could I have been doing something wrong?

    Read the article

  • New Article: SharePoint 2010 for Developers – What's new?

    - by Sahil Malik
    SharePoint 2010 Training: more information This is a nice overview/beginner's article about what is new in SharePoint 2010 from purely a developer point of view. Excerpt – "In some ways SharePoint 2007 was a brand new incarnation of the SharePoint product. For the very first time, ASP.NET 2.0 was applied properly to the product. Things such as master pages, membership providers, sitemap providers etc. were used heavily in SharePoint. As a result, SharePoint 2007 got a whole new developer story to it. But in some ways it was a first version of a big product, so the development story left us wanting for more. Wanting for more because in some ways the API wasn't ideal, and most certainly the development tools were somewhere between non-existent to bad. Diagnosing SharePoint errors was another frustrating story many have endured. What has changed in SharePoint 2010? Let's find out." Read full article ....

    Read the article

  • SQL SERVER – Beginning of SQL Server Architecture – Terminology – Guest Post

    - by pinaldave
    SQL Server architecture is a very deep subject, and covering it in a single post is an almost impossible task. However, it is a very popular topic among beginners and advanced users alike. I have asked my friend Anil Kumar, who is an expert in the SQL domain, to help me write a simple post about the beginnings of SQL Server architecture. As stated earlier this subject is very deep, so in this first article of the series he covers the basic terminology; in future articles he will explore the subject further. Anil Kumar Yadav is a Trainer, SQL Domain, at Koenig Solutions. Koenig is a premier IT training firm that provides several IT certifications, such as Oracle 11g, Server+, RHCA, SQL Server Training, Prince2 Foundation etc.

    In this article we will discuss the MS SQL Server architecture. The major components of SQL Server are:

    - Relational Engine
    - Storage Engine
    - SQL OS

    Now we will discuss and understand each one of them.

    1) Relational Engine: Also called the query processor, the Relational Engine includes the components of SQL Server that determine exactly what your query needs to do and the best way to do it. It manages the execution of queries as it requests data from the storage engine and processes the results returned. Tasks of the Relational Engine include:

    - Query Processing
    - Memory Management
    - Thread and Task Management
    - Buffer Management
    - Distributed Query Processing

    2) Storage Engine: The Storage Engine is responsible for the storage and retrieval of data on the storage system (disk, SAN etc.). To understand it better, let's focus on the following diagram. When we talk about any database in SQL Server, there are two types of files that are created at the disk level – data files and log files. Data files physically store the data in data pages. Log files, also known as write-ahead logs, are used for storing transactions performed on the database. Let's understand data files and log files in more detail:

    Data File: A data file stores data in the form of data pages (8 KB each), and these data pages are logically organized in extents.

    Extents: Extents are logical units in the database. They are a combination of 8 data pages, i.e. 64 KB forms an extent. Extents can be of two types, mixed and uniform. Mixed extents hold different types of pages, such as index, system and object data pages. On the other hand, uniform extents are dedicated to only one type.

    Pages: As we should know what types of data pages can be stored in SQL Server, below are some of them:

    - Data Page: Holds the data entered by the user, except data of type text, ntext, nvarchar(max), varchar(max), varbinary(max), image and xml.
    - Index: Stores the index entries.
    - Text/Image: Stores LOB (Large Object) data like text, ntext, varchar(max), nvarchar(max), varbinary(max), image and xml data.
    - GAM & SGAM (Global Allocation Map & Shared Global Allocation Map): Used for saving information related to the allocation of extents.
    - PFS (Page Free Space): Information related to page allocation and unused space available on pages.
    - IAM (Index Allocation Map): Information pertaining to extents that are used by a table or index per allocation unit.
    - BCM (Bulk Changed Map): Keeps information about the extents changed in a bulk operation.
    - DCM (Differential Change Map): Information about extents that have been modified since the last BACKUP DATABASE statement, per allocation unit.

    Log File: Also known as the write-ahead log, it stores modifications to the database (DML and DDL). Sufficient information is logged to be able to:

    - Roll back transactions if requested
    - Recover the database in case of failure

    Write-ahead logging is used to create log entries. Transaction logs are written in chronological order in a circular way, and the truncation policy for logs is based on the recovery model.

    SQL OS: This lies between the host machine (Windows OS) and SQL Server. All the activities performed on the database engine are taken care of by SQL OS. It is a highly configurable operating system with a powerful API (application programming interface), enabling automatic locality and advanced parallelism. SQL OS provides various operating system services, such as memory management dealing with the buffer pool, log buffer and deadlock detection using the blocking and locking structure. Other services include exception handling and hosting for external components like the Common Language Runtime (CLR).

    I guess this brief article gives you an idea about the various terminologies related to SQL Server architecture. In future articles we will explore them further.

    Guest Author: The author of the article is Anil Kumar Yadav, Trainer, SQL Domain, Koenig Solutions. Koenig is a premier IT training firm that provides several IT certifications, such as Oracle 11g, Server+, RHCA, SQL Server Training, Prince2 Foundation etc.

    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Security, SQL Server, SQL Tips and Tricks, SQL Training, T SQL, Technology

    Read the article

  • DNS propagation question with name server change

    - by Brian
    The registrar I have is networksolutions.com, and for quite a while, the name servers were pointing to Site5.com, where hosting is for one of my domains. I wanted to bring DNS control back to networksolutions, so I pointed the name servers back to networksolutions and added in all my A records. However, I noticed that the site soon became unreachable. I'm curious as to why this happened? If the domain was pointing to either the old name servers or the new ones, it would still have the proper A records and whatnot. Is this because when I changed name servers, a request was made to delete them completely, and then the DNS servers worldwide have to wait for network solutions to send out the new ones or something? I was hoping this would be a switch with zero downtime, such as a normal A record change.

    Read the article

  • Foundation CSS Framework, how to change triangle on accordion [migrated]

    - by CreateSean
    I'm using the Foundation framework for the first time and for the most part everything is going smoothly. I am, however, having some trouble with the accordion, in that I need to change the open/close indicator triangle that is in use. You can see it in the docs here. I've looked through the CSS and found the accordion section in foundation.css at lines 709-719, but there is no image to change or adjust. I would like to change this icon to the one in my PSD, but just can't figure out where. See the attached screenshot for what needs to be changed. I know how to make changes; in this case I just can't find where to make the change.

    Read the article

  • Exception: Cannot start your application. The workgroup information file is missing or opened exclusively by another user

    - by Jeev
    We were getting this error when trying to connect to a password-protected Access file. This is what the connection string looked like:

        string conString = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source="Path to your access file";User Id=;Password=password";

    To fix the issue, this is what we did:

        string conString = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source="Path to your access file";Jet OLEDB:Database Password=password";

    We removed the User Id and changed Password to Jet OLEDB:Database Password. Hope this helps someone.
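    For illustration, here is a minimal, self-contained sketch of the corrected string in use. The file path and the surrounding program are mine, not the original poster's; only the "Jet OLEDB:Database Password" keyword comes from the fix above.

        using System;
        using System.Data.OleDb;

        class AccessPasswordDemo
        {
            static void Main()
            {
                // "Jet OLEDB:Database Password" supplies the database-level password.
                // A plain "Password" keyword is treated as a user-level (workgroup) password,
                // which is what triggers the workgroup information file error described above.
                string conString = @"Provider=Microsoft.Jet.OLEDB.4.0;" +
                                   @"Data Source=C:\data\sample.mdb;" +   // hypothetical path
                                   @"Jet OLEDB:Database Password=password";

                using (var conn = new OleDbConnection(conString))
                {
                    conn.Open();
                    Console.WriteLine("Connection state: " + conn.State);
                }
            }
        }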

    Read the article

  • Creating an ASP.NET report using Visual Studio 2010 - Part 2

    - by rajbk
    We continue building our report in this three-part series.
    Creating an ASP.NET report using Visual Studio 2010 - Part 1
    Creating an ASP.NET report using Visual Studio 2010 - Part 3

    Creating the Client Report Definition file (RDLC)

    Add a folder called "RDLC". This will hold our RDLC report. Right click on the RDLC folder, select "Add new item.." and add an "RDLC" with the name "Products". We will use the "Report Wizard" to walk us through the steps of creating the RDLC.

    In the next dialog, give the dataset the name "ProductDataSet". Change the data source to "NorthwindReports.DAL" and select "ProductRepository(GetProductsProjected)". The fields that are returned from the method are shown on the right. Click Next.

    Drag and drop the ProductName, CategoryName, UnitPrice and Discontinued fields into the Values container. Note that you can create much more complex groupings using this UI. Click Next. Most of the selections on this screen are grayed out because we did not choose a grouping on the previous screen. Click Next. Choose a style for your report. Click Next.

    The report graphic design surface is now visible. Right click on the report and add a page header and page footer. With the report design surface active, drag and drop a TextBox from the toolbox onto the page header. Drag one more textbox onto the page header. We will use the text boxes to add some header text, as shown in the next figure. You can change the font size and other properties of the textboxes using the formatting toolbar (marked in red). You can also resize the columns by moving your cursor in between columns and dragging.

    Adding Expressions

    Add two more text boxes to the page footer. We will use these to add the time the report was generated and the page numbers. Right click on the first textbox in the page footer and select "Expression". Add the following expression for the print date (note the = sign at the left of the expression in the dialog below):

        "© Northwind Traders " & Format(Now(),"MM/dd/yyyy hh:mm tt")

    Right click on the second text box and add the following for the page count:

        Globals.PageNumber & " of " & Globals.TotalPages

    Formatting the page footer is complete.

    We are now going to format the "Unit Price" column so it displays the number in currency format. Right click on the [UnitPrice] column (not the header) and select "Text Box Properties..". Under "Number", select "Currency". Hit OK.

    Adding a chart

    With the design surface active, go to the toolbox and drag and drop a chart control. You will need to move the product list table down first to make space for the chart control. The document can also be resized by dragging on the corner or at the page header/footer separator. In the next dialog, pick the first chart type. This can be changed later if needed. Click OK. The chart gets added to the design surface.

    Click on the blue bars in the chart (not the legend). This will bring up drop locations for the fields. Drag and drop the UnitPrice and CategoryName into the top (y axis) and bottom (x axis) as shown below. This will give us the total unit prices for a given category. That is the best I could come up with as far as what report to render, sorry :-) Delete the legend area to get more screen real estate. Resize the chart to your liking. Change the header, x axis and y axis text by double clicking on those areas.

    We made it this far. Let's impress the client by adding a gradient to the bar graph :-) Right click on the blue bar and select "Series properties". Under "Fill", add a color and secondary color and select the Gradient style. We are done designing our report. In the next section you will see how to add the report to the report viewer control, bind it to the data and make it refresh when the filter criteria are changed.

    Creating an ASP.NET report using Visual Studio 2010 - Part 3
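    As a rough preview of the binding step that Part 3 walks through, here is a sketch of the code-behind for a page hosting a ReportViewer control. The page class, control name and repository call are assumptions for illustration; the report path and dataset name come from the wizard settings above.

        using System;
        using Microsoft.Reporting.WebForms;

        public partial class ProductReportPage : System.Web.UI.Page
        {
            // ReportViewer1 is assumed to be declared in the .aspx markup.
            protected void Page_Load(object sender, EventArgs e)
            {
                if (IsPostBack) return;

                ReportViewer1.ProcessingMode = ProcessingMode.Local;
                ReportViewer1.LocalReport.ReportPath = "RDLC/Products.rdlc";

                // "ProductDataSet" must match the dataset name chosen in the Report Wizard.
                var data = new NorthwindReports.DAL.ProductRepository().GetProductsProjected();
                ReportViewer1.LocalReport.DataSources.Clear();
                ReportViewer1.LocalReport.DataSources.Add(new ReportDataSource("ProductDataSet", data));
                ReportViewer1.LocalReport.Refresh();
            }
        }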

    Read the article

  • Debugging Tips for Skinning

    - by Christian David Straub
    Another guest post by Jeanne Waldman. If you are developing a skin for your Fusion Application in JDeveloper, you should know these tips:

    1. Firebug is your friend
    2. Uncompress the CSS style classes
    3. CHECK_FILE_MODIFICATION so that you see your skinning changes right away
    4. View the generated CSS file

    1. Firebug is your friend

    Install Firebug (http://getfirebug.com/layout) into Firefox and use it to view your rendered jspx page in the browser. You can select the HTML DOM nodes on your page and see the CSS styles applied to each DOM node.

    2. Uncompress the CSS style classes

    By default the style classes that are rendered are compressed. You may see style classes like class="x10" and class="x2e", but in your skin CSS file you have style classes like af|inputText::content or af|panelBox::header. It is easier to develop and debug a skin with Firebug if you see the uncompressed style classes. To do this:

    a. open web.xml
    b. add

        <context-param>
            <param-name>org.apache.myfaces.trinidad.DISABLE_CONTENT_COMPRESSION</param-name>
            <param-value>true</param-value>
        </context-param>

    c. save
    d. restart the server and re-run your page.

    3. CHECK_FILE_MODIFICATION so that you see your skinning changes right away

    For performance's sake, the ADF Faces framework does not check whether your skin's .css file has changed on every render. But this is exactly what you want to happen when you are developing or debugging a skin: you want your changes to get noticed right away, without restarting the server. To do this:

    a. open web.xml
    b. add

        <context-param>
            <description>If this parameter is true, there will be an automatic check of the modification date of your JSPs, and saved state will be discarded when JSP's change. It will also automatically check if your skinning css files have changed without you having to restart the server. This makes development easier, but adds overhead. For this reason this parameter should be set to false when your application is deployed.</description>
            <param-name>org.apache.myfaces.trinidad.CHECK_FILE_MODIFICATION</param-name>
            <param-value>true</param-value>
        </context-param>

    c. save
    d. restart the server and re-run your page.
    e. from then on, you can change your skin's .css file, save it, refresh your page and you should see the changes right away.

    4. View the generated CSS file

    There are different ways to view the generated CSS file, which is your skin's CSS file merged with all the skins it extends, processed, generated to the filesystem and linked to your generated HTML page. One way is to view it with Firebug. The problem with this approach is that you might see something a little different from the actual CSS file, because Firebug may do some extra processing. I like to view the generated CSS file by: Right click on your page in the browser

    Read the article

  • ca-certificates-java fails when trying to install openjdk-6-jre

    - by Jonas
    I use a VPS with Ubuntu Server 10.10 x64. I want to use Java and run the command sudo apt-get install openjdk-6-jre but it fails because the installation encounted errors while processing ca-certificates-java. I have tried to install the failed package with: sudo apt-get install ca-certificates-java How can I solve this? I have run sudo apt-get update and sudo apt-get upgrade but I get the same errors after that. I have also installed Ubuntu Server x64 on a VirtualBox, but the two Ubuntu Server 10.10 has different kernel versions (2.6.35 on VirtualBox and 2.6.18 on my VPS). And on VirtualBox I can install Jetty without any problems. The VPS is a fresh install of Ubuntu Server 10.10 x64, the first command I was running was sudo apt-get install openjdk-6-jre. When I run sudo apt-get install ca-certificates-java I get this message: Reading package lists... Done Building dependency tree Reading state information... Done ca-certificates-java is already the newest version. 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. 1 not fully installed or removed. After this operation, 0B of additional disk space will be used. Do you want to continue [Y/n]? Here I press Y then I get this message: Setting up ca-certificates-java (20100412) ... creating /etc/ssl/certs/java/cacerts... error adding brasil.gov.br/brasil.gov.br.crt error adding cacert.org/cacert.org.crt error adding debconf.org/ca.crt error adding gouv.fr/cert_igca_dsa.crt error adding gouv.fr/cert_igca_rsa.crt error adding mozilla/ABAecom_=sub.__Am._Bankers_Assn.=_Root_CA.crt error adding mozilla/AOL_Time_Warner_Root_Certification_Authority_1.crt error adding mozilla/AOL_Time_Warner_Root_Certification_Authority_2.crt error adding mozilla/AddTrust_External_Root.crt error adding mozilla/AddTrust_Low-Value_Services_Root.crt error adding mozilla/AddTrust_Public_Services_Root.crt error adding mozilla/AddTrust_Qualified_Certificates_Root.crt error adding mozilla/America_Online_Root_Certification_Authority_1.crt error adding mozilla/America_Online_Root_Certification_Authority_2.crt error adding mozilla/Baltimore_CyberTrust_Root.crt error adding mozilla/COMODO_Certification_Authority.crt error adding mozilla/COMODO_ECC_Certification_Authority.crt error adding mozilla/Camerfirma_Chambers_of_Commerce_Root.crt error adding mozilla/Camerfirma_Global_Chambersign_Root.crt error adding mozilla/Certplus_Class_2_Primary_CA.crt error adding mozilla/Certum_Root_CA.crt error adding mozilla/Comodo_AAA_Services_root.crt error adding mozilla/Comodo_Secure_Services_root.crt error adding mozilla/Comodo_Trusted_Services_root.crt error adding mozilla/DST_ACES_CA_X6.crt error adding mozilla/DST_Root_CA_X3.crt error adding mozilla/DigiCert_Assured_ID_Root_CA.crt error adding mozilla/DigiCert_Global_Root_CA.crt error adding mozilla/DigiCert_High_Assurance_EV_Root_CA.crt error adding mozilla/DigiNotar_Root_CA.crt error adding mozilla/Digital_Signature_Trust_Co._Global_CA_1.crt error adding mozilla/Digital_Signature_Trust_Co._Global_CA_2.crt error adding mozilla/Digital_Signature_Trust_Co._Global_CA_3.crt error adding mozilla/Digital_Signature_Trust_Co._Global_CA_4.crt error adding mozilla/Entrust.net_Global_Secure_Personal_CA.crt error adding mozilla/Entrust.net_Global_Secure_Server_CA.crt error adding mozilla/Entrust.net_Premium_2048_Secure_Server_CA.crt error adding mozilla/Entrust.net_Secure_Personal_CA.crt error adding mozilla/Entrust.net_Secure_Server_CA.crt error adding mozilla/Entrust_Root_Certification_Authority.crt error adding 
mozilla/Equifax_Secure_CA.crt error adding mozilla/Equifax_Secure_Global_eBusiness_CA.crt error adding mozilla/Equifax_Secure_eBusiness_CA_1.crt error adding mozilla/Equifax_Secure_eBusiness_CA_2.crt error adding mozilla/Firmaprofesional_Root_CA.crt error adding mozilla/GTE_CyberTrust_Global_Root.crt error adding mozilla/GTE_CyberTrust_Root_CA.crt error adding mozilla/GeoTrust_Global_CA.crt error adding mozilla/GeoTrust_Global_CA_2.crt error adding mozilla/GeoTrust_Primary_Certification_Authority.crt error adding mozilla/GeoTrust_Universal_CA.crt error adding mozilla/GeoTrust_Universal_CA_2.crt error adding mozilla/GlobalSign_Root_CA.crt error adding mozilla/GlobalSign_Root_CA_-_R2.crt error adding mozilla/Go_Daddy_Class_2_CA.crt error adding mozilla/IPS_CLASE1_root.crt error adding mozilla/IPS_CLASE3_root.crt error adding mozilla/IPS_CLASEA1_root.crt error adding mozilla/IPS_CLASEA3_root.crt error adding mozilla/IPS_Chained_CAs_root.crt error adding mozilla/IPS_Servidores_root.crt error adding mozilla/IPS_Timestamping_root.crt error adding mozilla/NetLock_Business_=Class_B=_Root.crt error adding mozilla/NetLock_Express_=Class_C=_Root.crt error adding mozilla/NetLock_Notary_=Class_A=_Root.crt error adding mozilla/NetLock_Qualified_=Class_QA=_Root.crt error adding mozilla/Network_Solutions_Certificate_Authority.crt error adding mozilla/QuoVadis_Root_CA.crt error adding mozilla/QuoVadis_Root_CA_2.crt error adding mozilla/QuoVadis_Root_CA_3.crt error adding mozilla/RSA_Root_Certificate_1.crt error adding mozilla/RSA_Security_1024_v3.crt error adding mozilla/RSA_Security_2048_v3.crt error adding mozilla/SecureTrust_CA.crt error adding mozilla/Secure_Global_CA.crt error adding mozilla/Security_Communication_Root_CA.crt error adding mozilla/Sonera_Class_1_Root_CA.crt error adding mozilla/Sonera_Class_2_Root_CA.crt error adding mozilla/Staat_der_Nederlanden_Root_CA.crt error adding mozilla/Starfield_Class_2_CA.crt error adding mozilla/StartCom_Certification_Authority.crt error adding mozilla/StartCom_Ltd..crt error adding mozilla/SwissSign_Gold_CA_-_G2.crt error adding mozilla/SwissSign_Platinum_CA_-_G2.crt error adding mozilla/SwissSign_Silver_CA_-_G2.crt error adding mozilla/Swisscom_Root_CA_1.crt error adding mozilla/TC_TrustCenter__Germany__Class_2_CA.crt error adding mozilla/TC_TrustCenter__Germany__Class_3_CA.crt error adding mozilla/TDC_Internet_Root_CA.crt error adding mozilla/TDC_OCES_Root_CA.crt error adding mozilla/TURKTRUST_Certificate_Services_Provider_Root_1.crt error adding mozilla/TURKTRUST_Certificate_Services_Provider_Root_2.crt error adding mozilla/Taiwan_GRCA.crt error adding mozilla/Thawte_Personal_Basic_CA.crt error adding mozilla/Thawte_Personal_Freemail_CA.crt error adding mozilla/Thawte_Personal_Premium_CA.crt error adding mozilla/Thawte_Premium_Server_CA.crt error adding mozilla/Thawte_Server_CA.crt error adding mozilla/Thawte_Time_Stamping_CA.crt error adding mozilla/UTN-USER_First-Network_Applications.crt error adding mozilla/UTN_DATACorp_SGC_Root_CA.crt error adding mozilla/UTN_USERFirst_Email_Root_CA.crt error adding mozilla/UTN_USERFirst_Hardware_Root_CA.crt error adding mozilla/ValiCert_Class_1_VA.crt error adding mozilla/ValiCert_Class_2_VA.crt error adding mozilla/VeriSign_Class_3_Public_Primary_Certification_Authority_-_G5.crt error adding mozilla/Verisign_Class_1_Public_Primary_Certification_Authority.crt error adding mozilla/Verisign_Class_1_Public_Primary_Certification_Authority_-_G2.crt error adding 
mozilla/Verisign_Class_1_Public_Primary_Certification_Authority_-_G3.crt error adding mozilla/Verisign_Class_2_Public_Primary_Certification_Authority.crt error adding mozilla/Verisign_Class_2_Public_Primary_Certification_Authority_-_G2.crt error adding mozilla/Verisign_Class_2_Public_Primary_Certification_Authority_-_G3.crt error adding mozilla/Verisign_Class_3_Public_Primary_Certification_Authority.crt error adding mozilla/Verisign_Class_3_Public_Primary_Certification_Authority_-_G2.crt error adding mozilla/Verisign_Class_3_Public_Primary_Certification_Authority_-_G3.crt error adding mozilla/Verisign_Class_4_Public_Primary_Certification_Authority_-_G2.crt error adding mozilla/Verisign_Class_4_Public_Primary_Certification_Authority_-_G3.crt error adding mozilla/Verisign_RSA_Secure_Server_CA.crt error adding mozilla/Verisign_Time_Stamping_Authority_CA.crt error adding mozilla/Visa_International_Global_Root_2.crt error adding mozilla/Visa_eCommerce_Root.crt error adding mozilla/WellsSecure_Public_Root_Certificate_Authority.crt error adding mozilla/Wells_Fargo_Root_CA.crt error adding mozilla/XRamp_Global_CA_Root.crt error adding mozilla/beTRUSTed_Root_CA-Baltimore_Implementation.crt error adding mozilla/beTRUSTed_Root_CA.crt error adding mozilla/beTRUSTed_Root_CA_-_Entrust_Implementation.crt error adding mozilla/beTRUSTed_Root_CA_-_RSA_Implementation.crt error adding mozilla/thawte_Primary_Root_CA.crt error adding signet.pl/signet_ca1_pem.crt error adding signet.pl/signet_ca2_pem.crt error adding signet.pl/signet_ca3_pem.crt error adding signet.pl/signet_ocspklasa2_pem.crt error adding signet.pl/signet_ocspklasa3_pem.crt error adding signet.pl/signet_pca2_pem.crt error adding signet.pl/signet_pca3_pem.crt error adding signet.pl/signet_rootca_pem.crt error adding signet.pl/signet_tsa1_pem.crt error adding spi-inc.org/spi-ca-2003.crt error adding spi-inc.org/spi-cacert-2008.crt error adding telesec.de/deutsche-telekom-root-ca-2.crt failed (VM used: java-6-openjdk). dpkg: error processing ca-certificates-java (--configure): subprocess installed post-installation script returned error exit status 1 Errors were encountered while processing: ca-certificates-java E: Sub-process /usr/bin/dpkg returned an error code (1) Update I also get a problem when running java -version: Error occurred during initialization of VM Could not reserve enough space for object heap Could not create the Java virtual machine. My VPS had 128MB of Memory, I changed to 256MB but got the same problem. Then I changed to 512MB and got the same problem. I found a related post on a forum: Sub-process /usr/bin/dpkg returned an error code (1) And I tried: sudo apt-get clean sudo apt-get --reinstall install openjdk-6-jre sudo dpkg --configure -a But I got the same problem, even when I'm using 512MB of Memory. Any suggestions?

    Read the article

  • Migrate add-on domain olddomain.com to newdomain.com

    - by eHx
    I have 2 domains that are registered at GoDaddy:

    - domaina.com (not hosted, only the domain name is registered to GD)
    - domainb.com (hosted at a different webhost, domain name registered to GD)

    domainb.com is an already working site with a different webhost, but the domain name is registered to GoDaddy (and I assume the nameservers are changed to point to the webhost). Now, I don't understand why this was done, but domainb.com is considered a subdomain on the host, meaning the files are in a separate folder on the server. Ex: public-html/domainb.com/public-html/FILES

    The structure is similar to this on the webhost:

    - HostNAME (main root folder)
    - domainb.com (subdomain of hostname)
    - domainc.com (etc...)
    - domaind.com (etc...)

    I want to transfer the site domainb.com to domaina.com, meaning domaina.com will become the new website, without having to re-upload all the content and CMS. The old one will redirect to domaina.com once the transfer is done (using 301 redirects). Can anyone tell me how I can do this?

    Read the article

  • Quick example: why coding standards must be in place

    - by DigiMortal
    One quick example of why coding standards must be in place. Take a look at the following code – property names are changed but nothing else:

        public string Property1 { get; set; }

        public string Property2
        {
            get;
            set;
        }

        public string Property3
        {
            get; set;
        }

    And yes – it is a real-world example.
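    For contrast, here is a sketch of what the same three properties might look like once a team agrees on a single convention (trivial auto-properties on one line). The class name is made up for illustration.

        // Hypothetical normalized version: one agreed layout for trivial auto-properties.
        public class Customer
        {
            public string Property1 { get; set; }
            public string Property2 { get; set; }
            public string Property3 { get; set; }
        }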

    Read the article

  • Introducing a new Umbraco datatype for Multi-lingual websites.

    - by Vizioz Limited
    Over the last 6 months we have been building various multi-lingual sites for different clients, and some of those clients have 1-to-1 relationships between some or all of their pages. Within Umbraco, you can copy a page (or a whole tree of pages) and keep a relationship between each of the pages and their new copy; this allows content editors to subscribe to the change notifications that Umbraco can create when one of the linked pages is changed. Unfortunately, one thing that is missing in Umbraco is any way to see which pages are related to each other, and a quick and easy way to jump between the related pages. We created a datatype that solves these problems and thought we would release it as an open source project (which we are still maintaining).

    Currently you can:
    1) See current relationships
    2) Add relationships
    3) Limit the number of relationships that can be added (by the data type)
    4) See the country flag (assuming a culture has been set on each of your top level site nodes for each country site)
    5) Link between the documents
    6) Change or delete the links

    An example where multiple languages are allowed:
    An example where only 2 languages exist (1 relationship):

    You can download the datatype from the Umbraco project page: Vizioz Relationships for Umbraco. Please do let us know what you think :)

    Read the article

  • Display date/time in Ubuntu lock screen (cinnamon-like)

    - by 3l4ng
    I was using Ubuntu 13.04 until a few days ago. I had installed Nemo, and I guess Cinnamon got installed too (I was still using Unity nonetheless). One of the good things about that was that the lock screen somehow changed to mimic Cinnamon's (with the background being a darkened version of the desktop background and the date/time displayed), something like this: I have no idea how this happened, but it looked great. It does not seem to work on Ubuntu 13.10. Does anyone have any ideas for getting a similar lock screen? I just need the date/time displayed on one side, with the background being a darkened version of my desktop wallpaper.

    Read the article

  • Good ol fashioned debugging

    - by Tim Dexter
    I have been helping out one of our new customers over the last day or two, and I have even managed to get to the bottom of their problem, FTW! They use BIEE and BIP and wanted to mount a BIP report in a dashboard page. So far so good, BIP does that! Just follow the instructions in the BIEE user guide. The wrinkle is that they want to enter some fixed instruction strings into the dashboard prompts to help the user. These are added as fixed values to the prompt, as the default values, so they appear first. Once the user makes a selection, the default strings disappear. It's a fair requirement, but the BIP report chokes.

    Now, the BIP report had been set up with the Autorun checkbox unchecked. I expected the BIP report to wait for the Go button to be hit, but it was trying to run immediately and failing. That was the first issue: you cannot stop the BIP report from trying to run in a dashboard. Even if Autorun is turned off, it seems that the dashboard still makes the request to BIP to run the report. Rather than BIP refusing because it is waiting for input, it goes ahead anyway; I guess the mechanism does not check the autorun flag when the request is coming from the dashboard. A bug? Might be, at least an enhancement request. It appears that between BIEE and BIP, they collectively ignore the autorun flag.

    With that in mind, how could we get BIP to at least not fail? This fact was stumping me on the parameter error: if the autorun flag was being respected, then why was BIP complaining about the parameter values? It should not even be doing anything until the Go button is clicked. Since I now knew the autorun flag was being ignored, it was a simple case of putting BIP into debug mode. I use the OC4J server on my laptop, so debug messages are routed through the DOS box used to start the OC4J container. When I changed a value on the dashboard prompt, I spotted some debug text rushing by that subsequently disappeared from the log once the operation was complete. Another bug? I needed to catch that text as it went by, using the print screen function with some software to grab multiple screens as the log appeared and then disappeared.

    The upshot is that when you change the dashboard prompt value, BIP validates the value against its own LOVs; if it is not in the list, it throws the error. Because 'Fill this first' and 'Fill this second', i.e. the fixed strings from the dashboard prompts, are not in the LOV lists, and because the report auto-runs as soon as the dashboard page is brought up, the report complains about invalid parameters. To get around this, I needed to get the strings into the LOVs. Easily done with a UNION clause:

        select 'Fill this first' from SH.Products Products
        UNION
        select Products."Prod Category" as "Prod Category"
        from SH.Products Products

    Now when BIP wants to validate the prompt value, the LOV query fires and finds the fixed string -> no error. No data, but definitely no errors :0)

    If users do run with the fixed values, you can capture that in the template. If there is no data in the report, either the fixed values were used or the parameters selected resulted in no rows. You can capture this in the template and display something like: 'Either your parameter values resulted in no data or you have not changed the default values'.

    That's the upside. The downside is that if your users run the report in the BIP UI they are going to see the fixed strings. You could alleviate that by having BIP display the fixed strings at the top of its parameter drop-down boxes (just set them as the default value for the parameter), but they will not disappear like they do in the dashboard prompts, see below. If the expected autorun behaviour worked, i.e. waiting for the Go button, then we would not have to work around it, but for now it's a pretty good solution. It was an enjoyable hour or so for me; it took me back to my developer daze, when we used to race each other for the most number of bug fixes. I used to run a distant 2nd behind 'Bugmeister Chen Hu' but led the chasing pack by a reasonable distance.

    Read the article

  • The CIO Identity Crisis — Can Cloud and Innovation Fix It?

    - by Dori DiMassimo-Oracle
    Featuring: Tom Fisher, CIO, Oracle Cloud Services. Webcast replay now available!

    The simple fact is this: the emergence of cloud has fundamentally changed the role of the CIO, making job descriptions obsolete, altering organizational structures and changing the benchmarks of success. In this webcast Tom Fisher discussed how CIOs can effectively make the transition from "keepers of the technology" to "chief innovators" and how a managed cloud solution can help them regain control of this new, multi-sourced environment and all the business insight it brings. Watch the webcast and read Tom's white paper "The CIO as Chief Innovation Officer: How Cloud is Changing the CIO Role".

    Read the article

  • JIT compiler for C, C++, and the likes

    - by Ebrahim
    Is there any just-in-time compiler out there for compiled languages, such as C and C++? (The first names that come to mind are Clang and LLVM! But I don't think they currently support it.)

    Explanation: I think software could benefit from runtime profiling feedback and aggressively optimized recompilation of hotspots at runtime, even for compiled-to-machine-code languages like C and C++. Profile-guided optimization does a similar job, with the difference that a JIT would be more flexible in different environments. In PGO you run your binary prior to releasing it; after it is released, it uses no environment/input feedback collected at runtime. So if the input pattern changes, it is prone to a performance penalty. But a JIT works well even under those conditions. However, I think it is debatable whether the performance benefit of JIT compilation outweighs its own overhead.

    Edit: Grammar

    Read the article

  • Ubuntu Software Center does not allow changes to software sources

    - by Michael Goldshteyn
    The checkboxes that appear to be changeable under the Edit / Software Sources dialog box cannot be changed. I click on them and they just turn gray and stay at their current setting.

    Update: When I run software-center from a terminal window and try to change one of the checkbox settings, I get:

        Traceback (most recent call last):
          File "/usr/lib/python2.7/dist-packages/softwareproperties/gtk/SoftwarePropertiesGtk.py", line 649, in on_isv_source_toggled
            self.backend.ToggleSourceUse(str(source_entry))
          File "/usr/lib/python2.7/dist-packages/dbus/proxies.py", line 143, in __call__
            **keywords)
          File "/usr/lib/python2.7/dist-packages/dbus/connection.py", line 630, in call_blocking
            message, timeout)
        dbus.exceptions.DBusException: com.ubuntu.SoftwareProperties.PermissionDeniedByPolicy: com.ubuntu.softwareproperties.applychanges

    These things happen instead of it properly prompting me for a password (for root privileges).

    Read the article

  • Improving the state of the art in API documentation sites

    - by Daniel Cazzulino
    Go straight to the site if you want: http://nudoq.org. You can then come back and continue reading :)

    Compare some of the most popular NuGet packages' API documentation sites: Json.NET, EntityFramework, NLog, Autofac. You see the pattern? Huge navigation tree views, static content with no comments/community content, very hard (if not impossible) to search/filter, etc. These are the product of automated tools that were developed years ago, in a time when CHM help files were common and even expected from libraries. Nowadays, most of the top packages on NuGet.org don't even provide an online documentation site at all: it's such a hassle for such a crappy user experience in the end! The good news is that it doesn't have to be that way.

    Introducing NuDoq

    A lot has changed since those early days of .NET. We now have NuGet packages and the awesome channel that is ...Read full article

    Read the article

  • AjaxToolKit - ModalPopup conflicts with the current stable version

    - by Guilherme Cardoso
    Those who are using the current stable version of the AjaxToolKit (40412) will face an annoying problem: the ModalPopup cannot be used together with an UpdatePanel and a textbox. This is because during the postback the value of the textbox is changed and a comma is appended. That is, if we use a ModalPopup to host a form and then add to a database, the value "Antonio" in the text box will be stored as "Antonio,". I've been researching this a bit, and the bug is present even in this release. To get around it I'm using the stable 30512 (May 2009) release - http://ajaxcontroltoolkit.codeplex.com/releases/view/27326

    One detail I noticed concerns the ModalPopup's OkControlID property. If we set it, the event of that control does not fire (for example, if the button adds the data to the database, the panel controlled by the ModalPopup disappears but the add never happens). Therefore it is best not to set anything on this property; everything works fine without it.
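    If switching toolkit versions is not an option, one blunt workaround (my own sketch, not from the original post; the control and handler names are invented) is to strip the stray trailing comma before using the value:

        // Hypothetical OK/save handler in the page code-behind: defensively trim the
        // trailing comma that the affected toolkit build appends during the async postback.
        protected void btnSave_Click(object sender, EventArgs e)
        {
            string name = txtName.Text.TrimEnd(',');
            // ... insert "name" into the database instead of using txtName.Text directly ...
        }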

    Read the article

  • CRM@Oracle Series: Forecasting

    - by tony.berk
    What do you trust more: the weather forecast or your sales forecast? I hope the answer is your sales forecast! Either way, would your sales forecast be more accurate if sales management had visibility into what the sales reps are forecasting and what has changed since the last forecast? What if management could adjust forecasts for accuracy based on analytic tools? Today's slidecast discusses sales forecasting and how Oracle implemented forecasting in our global implementation of Siebel CRM, including the steps involved to roll up the forecast. CRM@Oracle - Forecasting Click here to learn more about Oracle CRM products and here to learn about other customers using Oracle CRM. Are you enjoying the CRM@Oracle Series? If you have a particular CRM area or function which you'd like to hear how Oracle implemented it internally, let us know and we'll get it on our list.

    Read the article
