Search Results

Search found 8523 results on 341 pages for 'bobby tables'.

Page 199/341 | < Previous Page | 195 196 197 198 199 200 201 202 203 204 205 206  | Next Page >

  • A Technical Perspective On Rapid Planning

    - by Robert Story
    Upcoming Webcast. Title: Strategic Network Optimization - One Solution for Many Problems! Date: April 14, 2010. Time: 11:00 am EDT, 9:00 am MDT, 8:00 am PDT, 16:00 GMT. Product Family: Value Chain Planning. Summary: This one-hour session is recommended for System Administrators, Database Administrators, and Technical Users seeking a general overview of Rapid Planning, installation issues, and debug information. This webcast is intended to provide users with insight into known issues, and an overview of the debugging possibilities for Rapid Planning. Topics will include: benefits of using simulation planning; installing Oracle Rapid Planning, points to be aware of; relevant tables; Rapid Planning log files; information needed by Support. A short, live demonstration (only if applicable) and a question and answer period will be included. Click here to register for this session. The above webcast is a service of the E-Business Suite Communities in My Oracle Support. For more information on other webcasts, please reference the Oracle Advisor Webcast Schedule. Click here to visit the E-Business Communities in My Oracle Support. Note that all links require access to My Oracle Support.

    Read the article

  • Sync Google Contacts with QuickBooks

    - by dataintegration
    The RSSBus ADO.NET Providers offer an easy way to integrate with different data sources. In this article, we include a fully functional application that can be used to synchronize contacts between Google and QuickBooks. Like our QuickBooks ADO.NET Provider, the included application supports both the desktop versions of QuickBooks and QuickBooks Online Edition. Getting the Contacts. Step 1: Google accounts include a number of contacts. To obtain a list of a user's Google Contacts, issue a query to the Contacts table. For example: SELECT * FROM Contacts. Step 2: QuickBooks stores contact information in multiple tables. Depending on your use case, you may want to synchronize your Google Contacts with QuickBooks Customers, Employees, Vendors, or a combination of the three. To get data from a specific table, issue a SELECT query to that table. For example: SELECT * FROM Customers. Step 3: Retrieving all results from QuickBooks may take some time, depending on the size of your company file. To narrow your results, you may want to use a filter by including a WHERE clause in your query. For example: SELECT * FROM Customers WHERE (Name LIKE '%James%') AND IncludeJobs = 'FALSE'. Synchronizing the Contacts. Synchronizing the contacts is a simple process. Once the contacts from Google and the customers from QuickBooks are available, they can be compared and synchronized based on user preference. The sample application does this based on user input, but it is easy to create one that does the synchronization automatically. The INSERT, UPDATE, and DELETE statements available in both data providers make it easy to create, update, or delete contacts in either data source as needed. Pre-Built Demo Application. The executable for the demo application can be downloaded here. Note that this demo is built using BETA builds of the ADO.NET Provider for Google V2 and ADO.NET Provider for QuickBooks V3, and will expire in 2013. Source Code. You can download the full source of the demo application here. You will need the Google ADO.NET Data Provider V2 and the QuickBooks ADO.NET Data Provider V3, which can be obtained here.
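
    As a rough illustration of the synchronization step, the statements below are the kind of INSERT and UPDATE the article refers to. The column names (Name, Email) and the Id value are assumptions made for the example; the actual columns exposed by the providers may differ.

        -- Copy a Google contact into QuickBooks as a new customer
        -- (column names are hypothetical, for illustration only)
        INSERT INTO Customers (Name, Email) VALUES ('James Smith', 'james@example.com')

        -- Push a corrected email address back to Google Contacts
        UPDATE Contacts SET Email = 'james@example.com' WHERE Id = '12345'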

    Read the article

  • CodePlex Daily Summary for Tuesday, December 21, 2010

    CodePlex Daily Summary for Tuesday, December 21, 2010Popular ReleasesSQL Monitor - tracking sql server activities: SQL Monitor 3.0 alpha 7: 1. added script save/load in user query window 2. fixed problem with connection dialog when choosing windows auth but still ask for user name 3. auto open user table when double click one table node 4. improved alert message, added log only methodOpen NFe: Open NFe 2.0 (Beta): Última versão antes da versão final a ser lançada nos próximos dias.EnhSim: EnhSim 2.2.6 ALPHA: 2.2.6 ALPHAThis release supports WoW patch 4.03a at level 85 To use this release, you must have the Microsoft Visual C++ 2010 Redistributable Package installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=A7B7A05E-6DE6-4D3A-A423-37BF0912DB84 To use the GUI you must have the .NET 4.0 Framework installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=9cfb2d51-5ff4-4491-b0e5-b386f32c0992 - Fixing up some r...LINQ to Twitter: LINQ to Twitter Beta v2.0.18: Silverlight, OAuth, 100% Twitter API coverage, streaming, extensibility via Raw Queries, and added documentation. Bug fixes.ASP.NET MVC Project Awesome (jQuery Ajax helpers): 1.4.3: Helpers (controls) that you can use to build highly responsive and interactive Ajax-enabled Web applications. These helpers include Autocomplete, AjaxDropdown, Lookup, Confirm Dialog, Popup Form, Popup and Pager new stuff: Improvements for confirm, popup, popup form RenderView controller extension the user experience for crud in live demo has been substantially improved + added search all the features are shown in the live demoGanttPlanner: GanttPlanner V1.0: GanttPlanner V1.0 include GanttPlanner.dll and also a Demo application.N2 CMS: 2.1 release candidate 3: * Web platform installer support available N2 is a lightweight CMS framework for ASP.NET. It helps you build great web sites that anyone can update. Major Changes Support for auto-implemented properties ({get;set;}, based on contribution by And Poulsen) A bunch of bugs were fixed File manager improvements (multiple file upload, resize images to fit) New image gallery Infinite scroll paging on news Content templates First time with N2? Try the demo site Download one of the templ...TweetSharp: TweetSharp v2.0.0.0 - Preview 6: Documentation for this release may be found at http://tweetsharp.codeplex.com/wikipage?title=UserGuide&referringTitle=Documentation. Note: This code is currently preview quality. Preview 6 ChangesMaintenance release with user reported fixes Preview 5 ChangesMaintenance release with user reported fixes Preview 4 ChangesReintroduced fluent interface support via satellite assembly Added entities support, entity segmentation, and ITweetable/ITweeter interfaces for client development Numer...Team Foundation Server Administration Tool: 2.1: TFS Administration Tool 2.1, is the first version of the TFS Administration Tool which is built on top of the Team Foundation Server 2010 object model. TFS Administration Tool 2.1 can be installed on machines that are running either Team Explorer 2010, or Team Foundation Server 2010.Hacker Passwords: HackerPasswords.zip: Source code, executable and documentationSubtitleTools: SubtitleTools 1.3: - Added .srt FileAssociation & Win7 ShowRecentCategory feature. - Applied UnifiedYeKe to fix Persian search problems. 
- Reduced file size of Persian subtitles for uploading @OSDB.Facebook C# SDK: 4.1.0: - Lots of bug fixes - Removed Dynamic Runtime Language dependencies from non-dynamic platforms. - Samples included in release for ASP.NET, MVC, Silverlight, Windows Phone 7, WPF, WinForms, and one Visual Basic Sample - Changed internal serialization to use Json.net - BREAKING CHANGE: Canvas Session is no longer support. Use Signed Request instead. Canvas Session has been deprecated by Facebook. - BREAKING CHANGE: Some renames and changes with Authorizer, CanvasAuthorizer, and Authorization ac...NuGet (formerly NuPack): NuGet 1.0 build 11217.102: Note: this release is slightly newer than RC1, and fixes a couple issues relating to updating packages to newer versions. NuGet is a free, open source developer focused package management system for the .NET platform intent on simplifying the process of incorporating third party libraries into a .NET application during development. This release is a Visual Studio 2010 extension and contains the the Package Manager Console and the Add Package Dialog. This new build targets the newer feed (h...WCF Community Site: WCF Web APIs 10.12.17: Welcome to the second release of WCF Web APIs on codeplex Here is what is new in this release. WCF Support for jQuery - create WCF web services that are easy to consume from JavaScript clients, in particular jQuery. Better support for using JsonValue as dynamic Support for JsonValue change notification events for databinding and other purposes Support for going between JsonValue and CLR types WCF HTTP - create HTTP / REST based web services. This is a minor release which contains fixe...LiveChat Starter Kit: LCSK v1.0: This is a working version of the LCSK for Visual Studio 2010, ASP.NET MVC 3 (using Razor View Engine). this is still provider based (with 1 provider Sql) and this is still using WebService and Windows Forms operator console. The solution is cleaner, with an installer to create tables etc. You can also install it via nuget (Install-Package lcsk) Let me know your feedbackOrchard Project: Orchard 0.9: Orchard Release Notes Build: 0.9.253 Published: 12/16/2010 How to Install OrchardTo install the Orchard tech preview using Web PI, follow these instructions: http://www.orchardproject.net/docs/Installing-Orchard-Using-Web-PI.ashx Web PI will detect your hardware environment and install the application. --OR-- Alternatively, to install the release manually, download the Orchard.Web.0.9.253.zip file. The zip contents are pre-built and ready-to-run. Simply extract the contents of the Orch...DotSpatial: DotSpatial 12-15-2010: This release contains a few minor bug fixes and hopefully the GDAL libraries for the 3.5 x86 build actually built to the correct directory this time.DotNetNuke® Community Edition: 05.06.01 Beta: This is the initial Beta of DotNetNuke 5.6.1. See the DotNetNuke Roadmap a full list of changes in this release.MSBuild Extension Pack: December 2010: Release Blog Post The MSBuild Extension Pack December 2010 release provides a collection of over 380 MSBuild tasks. 
A high level summary of what the tasks currently cover includes the following: System Items: Active Directory, Certificates, COM+, Console, Date and Time, Drives, Environment Variables, Event Logs, Files and Folders, FTP, GAC, Network, Performance Counters, Registry, Services, Sound Code: Assemblies, AsyncExec, CAB Files, Code Signing, DynamicExecute, File Detokenisation, GU...Silverlight Contrib: Silverlight Contrib 2010.1.0: 2010.1.0 New FeaturesCompatibility Release for Silverlight 4 and Visual Studio 2010New Projectsaqq2: aqq2aqq2aqq2aqq2aqq2aqq2aqq2aqq2aqq2aqq2aqq2aqq2aqq2aqq2aqq2aqq2aqq2aqq2aqq2aqq2Cameleon CMS: Caméléon, le CMS open-source le plus simple d'utilisation sur le marché.Dail Emulator: Dail Emulator is a software that emulates the server software of Aeonsofts game FlyFF. Specifically made for the now defunct version 6 of the game.Dmitry Lyalin's WP7 Sample Application: A project where im putting all my public samples in a single WP7 client test applicationElixir: DotNetNuke module test framework. Use this DLL to write concise, to-the-point tests for your modules that can target multiple DNN platforms and multiple browsers.FlightTableEditor: This edits the Flight and Healing Tables of Pokémon Fire Red and Pokémon Leaf Green.Genetico UNLaM: Algoritmo Genetico UNLaMHelpCentral HelpDesk Ticket Application: An open-ended framework and client for a helpdesk ticket application and bug tracker. Built to be as extensible and available as possible, the central system is a WCF webservice which can be accessed by any client application.Import Media for Umbraco: This package allows you to import multiple or large files that have been uploaded to the server via FTP or other means. A new media item is created in the media tree and the media is moved from the upload location to the new directory for the media item.LargeCMS full sample (How to call CryptMsg API in streaming mode): LargeCms implements SignedLargeCMS and EnvelopedLargeCMS classes, my own version of .NET SignedCms and EnvelopedCMS to work with large data. It shows a way to call CryptMsg API in streaming mode to overcome the limitations of current .NET classesLinq.Autocompiler: Adds runtime, on demand compilation of Linq queries (in Linq to SQL), which creates CompiledQuery instances and stores them in a cache. Mathetris: it's a game.MonoWiimote: Managed c# library for using Wii Remote device under Mono/Linux enviroment. It requires libcwiimote-0.4 (sudo apt-get install libcwiimote-0.4).Movie Captor: Concorrendo no Desafio .NETNSpotifyLib: A .NET library contains wrapper classes for the Spotify Metadata API. http://developer.spotify.com/en/metadata-api/overview/ This library supports searching for artists, albums and tracks by using the Spotify Metadata API. In the future I'll add support for Lookups as well.Silverhawk: It's developed in C# programming language.SimpleSpy: Detecting object handle on cursor move and get its information. Basically this is a bare bone program for detecting object handle at cursor and add your own logic to detect for further information. SunburnJitter: JitterSunburn contains helper classes and methods to use the jitter physics engine easier in your sunburn projects. Simple component based shapes will provide easy editing in editor. 
It's developed in C#.Sunrod: Sunrod is a .NET wrapper for the Obsidian Portal API.TouchStack - A MonoTouch ServiceStack.NET client: TouchStack makes it easier for MonoTouch developers to consume Web services created and exposed by the brilliant ServiceStack.NET webservice framework. TouchStack is written in C# and uses the MonoTouch library.VNTViewer: Viewer for notes/memos (VNT files)VpNet: Official .NET Wrapper for Virtual Paradise.xlGUI: Streamlet GUI Framework

    Read the article

  • How should I take being told that I was wrong?

    - by Chris
    On a fairly important project with short timelines I decided to use SubSonic for straightforward data access. I wired up a handful of forms, created matching database tables and POCOs for each, and used SubSonic's simple repository mode for the data access. Everything worked well, I was able to bang these forms out pretty quickly, and I moved on to other things. Since that time I have heard that using SubSonic was a 'cowboy move', that it was implemented 'incorrectly', and that 'the person who used it didn't even know how to use SubSonic'. What I would like to know is, how should I take this? There were and still are no standards for data access at this company, so there is no violation of a standard. The forms worked exactly as requested and saved the data to the database correctly. And by spending only a few days on the forms instead of weeks, I saved a lot of time which was used for other functionality in the project. So in light of all of this, I am confused as to what was 'incorrect'. Am I missing something here? Thanks for your answers.

    Read the article

  • SQL SERVER – Answer – Value of Identity Column after TRUNCATE command

    - by pinaldave
    Earlier I had a conversation with a reader that almost gave me a headache. I suggest all of you read it before continuing with this blog post: SQL SERVER – Reseting Identity Values for All Tables. I believe he faced this situation because he did not understand the difference between SQL SERVER – DELETE, TRUNCATE and RESEED Identity. I wrote a follow-up blog post explaining the difference between them. I asked a small question in the second blog post and I received many interesting comments. Let us go over the question and its answer here one more time. Here is the scenario to set up the puzzle. Create Table with Seed Identity = 11. Insert Value and Check Seed (it will be 11). Reseed it to 1. Insert Value and Check Seed (it will be 2). TRUNCATE Table. Insert Value and Check Seed (it will be 11). Let us see the T-SQL Script for the same. USE [TempDB] GO -- Create Table CREATE TABLE [dbo].[TestTable]( [ID] [int] IDENTITY(11,1) NOT NULL, [var] [nchar](10) NULL ) ON [PRIMARY] GO -- Build sample data INSERT INTO [TestTable] VALUES ('val') GO -- Select Data SELECT * FROM [TestTable] GO -- Reseed to 1 DBCC CHECKIDENT ('TestTable', RESEED, 1) GO -- Build sample data INSERT INTO [TestTable] VALUES ('val') GO -- Select Data SELECT * FROM [TestTable] GO -- Truncate table TRUNCATE TABLE [TestTable] GO -- Build sample data INSERT INTO [TestTable] VALUES ('val') GO -- Select Data SELECT * FROM [TestTable] GO -- Question for you Here -- Clean up DROP TABLE [TestTable] GO Now let us see the output of the three select statements: 1) first select, after creating the table; 2) second select, after reseeding the table; 3) third select, after truncating the table. The reason is simple: if the table contains an identity column, the counter for that column is reset to the seed value defined for the column. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Technology

    Read the article

  • Combo/Input LOV displaying non-reference key value

    - by [email protected]
    It's a very common use case of an LOV that we want to display a non-key value in the LOV but store the key value in the DB. I had to do the same in a sample application I was building. During implementation, I realized that there are multiple ways to achieve this. I am going to describe each of these below. Example: Let's take an example from our classic HR schema. I have 2 tables, Employee and Department, where Dno is the foreign key attribute in Employee that references the Department table. I want to create an LOV for Department such that the list always displays Dname instead of Dno. However, when I update it, it should update the reference key Dno. To achieve this I had 3 alternatives. Approach 1: Create a composite VO and add the attributes from Department into Employee using a join. Refer to the blog http://andrejusb.blogspot.com/2009/11/defining-lov-on-reference-attribute-in.html. Positives: 1. Easy to implement and use. 2. We can use this attribute directly in queries defined on the new attribute, i.e. if I have to display this inside a query panel. Negative: We have to create an additional join on the VO. Ex: SELECT Employees.EMPLOYEE_ID, Employees.FIRST_NAME, Employees.LAST_NAME, Employees.EMAIL, Employees.PHONE_NUMBER, Department.Dno, Department.Dname FROM EMPLOYEES Employees, Department Department WHERE Employees.Dno = Department.Dno. Approach 2:

    Read the article

  • Is HTML5/WebGL performance unreliable on low-end Android tablets and phones?

    - by Boris van Schooten
    I've developed a couple of WebGL games, and am trying them out on Android. I found that they run very slowly on my tablet, however. For example, a game with 10 sprites or so runs at 5 fps. I tried Chrome and CocoonJS, but they are comparably slow. I also tried other games, and even games with only 5 or so moving sprites are this slow. This seems inconsistent with reports from others, such as this benchmark. Typically, when people talk about HTML5 game performance, they mention well-known and higher-end phones and tablets. While my 7" tablet is cheap (I believe it's a relabeled Allwinner tablet, apparently with the Mali 400 GPU), I found it generally has good gaming performance. All the games I tried run smoothly. I also developed an OpenGL ES 2 demo with 200 shaded 3D objects, and it ran at 50 fps. My suspicion is that many low-end and white-label devices may have unacceptable HTML5/WebGL support, which means there may be a large section of gamers you will not reach when you choose this as your platform. I've heard rumors about inconsistent performance of HTML5 and WebGL on different devices, but no clear picture emerges. I would like to hear if any of you have had similar experiences with HTML5 or WebGL, or whether I can find information about the percentage of devices I can expect to have decent performance.

    Read the article

  • multi user web game with scheduled processing?

    - by Rooq
    I have an idea for a game which I am in the process of designing, but I am struggling to establish if the way I plan to implement it is possible. The game is a text-based sports management simulation. It will require players to take certain actions through a web browser which will interact with a database - adding/updating and selecting. Most of the code required to be executed at this point will be fairly straightforward. The main processing will be done by applications which are scheduled to run on the server at certain times. These apps will process transactions added by the players and also perform some automatic processing based on the game date. My plan was to use a SQL Server database (at last count I require about 20 tables) and VB.net for all the coding (coming from a mainframe programming background this language is the simplest for me to get to grips with). I will also need a scheduling tool on the server. Can anyone tell me if what I am planning is feasible before I dive into the actual coding stage of my project?

    Read the article

  • E-Business Suite Proactive Support - Workflow Analyzer

    - by Alejandro Sosa
    Overview: The Workflow Analyzer is a standalone, easy-to-run tool created to read, validate and troubleshoot Workflow component configuration as well as runtime data. It identifies areas where potential problems may arise and, based on a set of best practices, suggests to the Workflow System Administrator what to do when such potential problems are found. This tool represents a proactive way to verify Workflow configuration and runtime data to prevent issues ahead of time, before they can have a more considerable impact on a production environment. Installation: Since it is standalone there are no prerequisites, and it runs on Oracle E-Business applications from 11.5.10 onwards. It is installed on the back-end server and can be run directly from SQL*Plus. The output of this tool is written to a friendly formatted HTML file containing the following on both Workflow component configuration and Workflow runtime data: Workflow-related database initialization parameters; relevant Oracle E-Business profile option values; Workflow-owned concurrent programs schedule and Workflow components status; Workflow notification mailer configuration and throughput via related queues and table; Workflow-relevant recommended and critical one-off patches as well as current code level; Workflow database footprint, obtained by reading Workflow run-time tables to identify aged processes not being purged. It also checks for large open and closed processes or unhealthy looping conditions in a workflow process, among other checks. See a sample of Workflow Analyzer's output here. Besides performing the validations listed above, the Workflow Analyzer provides clarification on the issues it finds and refers the reader to specific Oracle MOS documents to address the findings, or explains the condition so the reader can take proper action. How to get it? The Workflow Analyzer can be obtained from Oracle MOS: Workflow Analyzer script for E-Business Suite Workflow Monitoring and Maintenance (Doc ID 1369938.1), and the supplemental note How to run EBS Workflow Analyzer Tool as a Concurrent Request (Doc ID 1425053.1) explains how to register and run this tool as a concurrent program. This way the report from the Workflow Analyzer can be submitted from the Application and its output can be seen from the application as well.

    Read the article

  • How would I implement this application idea?

    - by Mike Wills
    I am a D&D gamer and a developer who has mostly worked with ASP.NET applications professionally. I have written some chat bots in Node.js and I have only played a little with PHP but wrote nothing serious. I have been inspired to create a site that allows a person to keep track of characters (aka the character sheet). I am thinking of using this as a learning opportunity to learn noSQL and to write a full javascript front-end. I want this application to save each value as I change it. So if I edit the armor class, it is saved immediately instead of waiting until I hit the submit button. I think that will make it easier to use while gaming and will keep me from losing anything because I forgot to save a change. I have never done anything like this. How do you implement this style of application? Is there a tutorial or howto to get me on the right path? While I would really like to use ASP.NET, I don't have a Windows server to publish on (and I really can't afford to pay for a service). What language that runs on Linux would work well for this type of application? Note: I feel noSQL would work in this case because of the sheer number of tables required to create something like this in SQL.

    Read the article

  • Which tools should be used for data migration between environments?

    - by Paula Speranza-Hadley
    With the Oracle Utilities Application Framework based products there are a number of tools provided that can be used to transfer data from one environment to another. There are three main tools that implementations use: ConfigLab - a configurable copy facility that is metadata aware and therefore understands the relationships between objects and, by invoking the relevant maintenance objects, validates the data copied. This utility uses the object validation to help ensure data integrity; basically it is a set of configuration tables and a set of batch jobs to perform the migration of data. Bundling - a configurable release management tool that allows exporting of Advanced Configuration Environment based objects (business services, business objects, UI Maps etc.) from one environment to another. Blueprint - an Oracle Utilities Software Development Kit (SDK) based tool to import metadata from the development environment to your initial testing environment. The utility is command-line based and basically uses a text-based configuration file to drive the utility on the source and target sides. Each tool has a role in an implementation, but you must be careful to use the right tool for the right job within an implementation. The suggestions are as follows: Only use the Blueprint tool for migrating data from your development platform to your initial test environment. The Blueprint tool is not designed to move large amounts of data, certainly is risky if not used correctly, and can potentially break the integrity of your data. The SDK provides the configuration data that it is used for (mainly metadata). This should not be extended as, while it can perform data migration on any data, it is not efficient and is risky for certain types of configuration data. Additional information can be found in the following whitepaper: Oracle Utilities Application Framework - Release Management - Software Configuration Management on MyOracle.com

    Read the article

  • Oracle Develop Newbies

    - by Cassandra Clark
    There are a number of us in the Oracle Technology Network team that came over from the Sun acquisition so we are true Oracle Develop "newbies." We are boning up on Oracle history and thought it would be fun to test your knowledge too. Below are a few Oracle history questions. Post your answers in the comment section of the blog and if you answer all questions correctly you will be listed in the next post as an "Oracle Genius". Feel free to turn the tables on your fellow blog readers by posting your own Oracle history questions. If you stump the community we'll add your question to our next post as well. Oracle History Quiz: 1) In 2003, what Applications rival company did Oracle acquire? 2) In which year was JDeveloper first released? 3) In what language was Oracle v 1.0 written? 4) What Oracle program is designed to recognize and reward members of the Oracle Technology and Applications communities for their contributions back to the Oracle community? 5) What party event draws in nearly 4,000 attendees every year during Oracle OpenWorld, Oracle Develop and now JavaOne? See you in September!

    Read the article

  • SQL SERVER – Validating Unique Columnname Across Whole Database

    - by pinaldave
    I sometimes come across very strange requirements and often I do not receive a proper explanation of the same. Here is one of those examples. Asker: “Our business requirement is when we add new column we want it unique across current database.” Pinal: “Why do you have such requirement?” Asker: “Do you know the solution?” Pinal: “Sure I can come up with the answer but it will help me to come up with an optimal answer if I know the business need.” Asker: “Thanks – what will be the answer in that case.” Pinal: “Honestly I am just curious about the reason why you need your column name to be unique across database.” (Silence) Pinal: “Alright – here is the answer – I guess you do not want to tell me reason.” Option 1: Check if Column Exists in Current Database IF EXISTS (  SELECT * FROM sys.columns WHERE Name = N'NameofColumn') BEGIN SELECT 'Column Exists' -- add other logic END ELSE BEGIN SELECT 'Column Does NOT Exists' -- add other logic END Option 2: Check if Column Exists in Current Database in Specific Table IF EXISTS (  SELECT * FROM sys.columns WHERE Name = N'NameofColumn' AND OBJECT_ID = OBJECT_ID(N'tableName')) BEGIN SELECT 'Column Exists' -- add other logic END ELSE BEGIN SELECT 'Column Does NOT Exists' -- add other logic END I guess the user did not want to share the reason why he had the unique requirement of having a column name be unique across the database. Here is my question back to you – have you ever faced a similar situation where you needed a unique column name across a database? If not, can you guess what could be the reason for this kind of requirement? Additional Reference: SQL SERVER – Query to Find Column From All Tables of Database Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL System Table, SQL Tips and Tricks, T SQL, Technology
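
    As a related sketch, the same catalog views used in the two options can also report where a candidate name is already in use, which is handy before rejecting a new column. The name N'NameofColumn' is just a placeholder, as in the post.

        -- List every table in the current database that already has a column with the candidate name
        SELECT t.name AS TableName, c.name AS ColumnName
        FROM sys.columns c
        INNER JOIN sys.tables t ON c.object_id = t.object_id
        WHERE c.name = N'NameofColumn'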

    Read the article

  • Thick models Vs. Business Logic, Where do you draw the distinction?

    - by TokenMacGuy
    Today I got into a heated debate with another developer at my organization about where and how to add methods to database mapped classes. We use sqlalchemy, and a major part of the existing code base in our database models is little more than a bag of mapped properties with a class name, a nearly mechanical translation from database tables to Python objects. In the argument, my position was that the primary value of using an ORM is that you can attach low-level behaviors and algorithms to the mapped classes. Models are classes first, and secondarily persistent (they could be persistent using XML in a filesystem, you don't need to care). His view was that any behavior at all is "business logic", and necessarily belongs anywhere but in the persistent model, which is to be used for database persistence only. I certainly do think that there is a distinction between what is business logic, and should be separated, since it has some isolation from the lower level of how that gets implemented, and domain logic, which I believe is the abstraction provided by the model classes argued about in the previous paragraph, but I'm having a hard time putting my finger on what that is. I have a better sense of what might be the API (which, in our case, is HTTP "ReSTful"), in that users invoke the API with what they want to do, distinct from what they are allowed to do, and how it gets done. tl;dr: What kinds of things can or should go in a method in a mapped class when using an ORM, and what should be left out, to live in another layer of abstraction?

    Read the article

  • SQL SERVER – Using MAXDOP 1 for Single Processor Query – SQL in Sixty Seconds #008 – Video

    - by pinaldave
    Today’s SQL in Sixty Seconds video is inspired by my presentation at TechEd India 2012 on Speed up! – Parallel Processes and Unparalleled Performance. There are always special cases when it is about SQL Server. There are always a few queries which give optimal performance when they are executed on a single processor, and there are always queries which give optimal performance when they are executed on multiple processors. I will be presenting how to identify such queries as well as the best practices related to the same. In this quick video I am going to demonstrate, for a query that gives optimal performance when running on a single CPU, how one can restrict it to a single CPU by using the hint OPTION (MAXDOP 1). Related posts: Difference Temp Table and Table Variable – Effect of Transaction; Effect of TRANSACTION on Local Variable – After ROLLBACK and After COMMIT; Debate – Table Variables vs Temporary Tables – Quiz – Puzzle – 13 of 31. I encourage you to submit your ideas for SQL in Sixty Seconds. We will try to accommodate as many as we can. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Database, Pinal Dave, PostADay, SQL, SQL Authority, SQL in Sixty Seconds, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Video
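
    For readers who have not seen the hint in use, here is a minimal sketch. The table and column names are hypothetical; only the OPTION (MAXDOP 1) clause is the point of the example.

        -- Force this one query to run on a single processor (serial plan)
        SELECT ProductID, SUM(LineTotal) AS TotalSales
        FROM Sales.SalesOrderDetail
        GROUP BY ProductID
        OPTION (MAXDOP 1)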

    Read the article

  • Session state provider and atomic operations

    - by vtortola
    Hi, I've been thinking about this and it is blowing my mind... How does a session state provider properly work internally? I mean, I tried to write a custom session state provider based on Azure Tables or Blobs, but I quickly realized that because there is no way to ensure an atomic operation or establish a lock, race conditions are likely to happen when several web servers operate on that shared information. I know that there is a SQL Server Session State Provider (SQLS-SSP) and people are happy with it, so I guess that it's using some kind of transaction isolation level in order to accomplish some degree of concurrency safety, like checking if the data is locked (a simple column), locking it if not, and returning the data in an atomic operation, but is that so? What happens if the data is locked? Does it return an error? Block the call for a while? Return it in a read-only fashion? Cloud computing paradigms may be somewhat new, but web farms have been here for a while, and I'm pretty new to this... do you recommend any good reading about the topic? Thanks.
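
    To make the pattern the question is guessing at concrete, here is a simplified sketch of how a SQL-backed provider could take a per-session lock atomically. The table and column names are hypothetical (this is not the real ASPState schema), and @SessionId stands for a parameter supplied by the provider.

        -- Try to take the lock: the UPDATE succeeds only if the row is currently
        -- unlocked, so two web servers cannot both win the race.
        UPDATE Sessions
        SET Locked = 1, LockAcquired = GETUTCDATE()
        WHERE SessionId = @SessionId AND Locked = 0

        IF @@ROWCOUNT = 1
            SELECT SessionData FROM Sessions WHERE SessionId = @SessionId   -- we own the lock
        ELSE
            SELECT Locked, LockAcquired FROM Sessions WHERE SessionId = @SessionId   -- locked elsewhere; caller waits and retries

    As far as I know, in the ASP.NET SQL provider a blocked caller is not given an error; the request waits and retries until the lock is released or the request times out, which matches the "block the call for a while" guess.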

    Read the article

  • Including BLOB images in your PDF Reports

    - by thatjeffsmith
    Earlier this year we walked through how to work with BLOBs in Oracle SQL Developer. So you already know how to INSERT, UPDATE and view the BLOBs stored in your tables. But now I want to show you how to include those images in your PDF reports. You know how to work with SQL Developer reports, right? No? OK, let’s take a quick run down memory lane then: How to Build a Bar Chart; Child Reports – click on a parent record for on-the-fly children records. Alright, so if you have a GRID report that contains a BLOB column, you have the option of including the BLOB contents when you create a PDF export. At design time, specify how you want the BLOB content to be treated when you export to PDF. Note that you must specify the treatment of the BLOBs in the report design; you won’t be prompted when you launch the Export wizard dialog. When you open your PDF, there will be a link to the image. Click it, then confirm. It will launch the default image viewer on your machine. I hope your pictures are more exciting than mine.

    Read the article

  • Which would be a better way to load data via ajax

    - by Mike
    I am using Google Maps and returning HTML/lat/long from my MySQL database. Currently: 1) A user picks a business category, e.g. "Video Production". 2) An ajax call is sent to a CodeIgniter controller. 3) The controller then queries the db and returns the following data via JSON: the lat/long of the marker and the HTML for the popup window (this is approximately 34 rows in the database, across two tables, per business). 4) The ajax call receives this data and then plots the marker, along with the HTML, onto the map. The data that is returned from the controller is one big JSON object... This is done for all businesses that exist in the Video Production category (currently approx. 40 businesses). As you can see, pulling this data for multiple categories (100s of businesses) can get very, very taxing on the server. My question is: would it be more beneficial to modify the process flow as such: 1) A user picks a business category, e.g. "Video Production". 2) An ajax call is sent to a CodeIgniter controller. 3) The controller then queries the database for the location-based information only: lat/long and level (used to change the marker icon color). This would be a single row per business with several columns. 4) The ajax call receives this data and then plots the marker on the map. 5) When the user clicks a marker, an ajax call is sent to a CodeIgniter controller. 6) The controller queries the database for the HTML and additional data based on business_id. And if not, what are some better suggestions for this problem? In summary, this means that rather than including the HTML and additional data along with each business, I would submit only minimal location information and then re-query for the rest when each business marker is clicked. Potential downsides: longer load times when a user clicks a marker icon; more code??; more queries to the database. (A rough sketch of the two-query approach is below.)
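
    A rough sketch of the two-query approach, with hypothetical MySQL table and column names (businesses, business_categories and business_details are placeholders for whatever the real schema uses):

        -- Phase 1: lightweight query used to plot every marker in a category
        SELECT b.id, b.lat, b.lng, b.level
        FROM businesses b
        JOIN business_categories bc ON bc.business_id = b.id
        WHERE bc.category = 'Video Production';

        -- Phase 2: detail query issued only when a single marker is clicked
        SELECT d.*
        FROM business_details d
        WHERE d.business_id = ?;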

    Read the article

  • PHP MYSQL loop to check if LicenseID Values are contained in mysql DB [closed]

    - by Jasper
    I am having some trouble finding the right loop to check if some values are contained in a MySQL DB. I'm writing a piece of software and I want to add license IDs. Each user has x keys to use. Now when the user starts the client, it invokes a PHP page that checks whether the key sent in the POST method is stored in the DB or not. If that key isn't stored then I need to check the number of his keys: if it's already X I'll ban him, otherwise I add the new key to the DB. I'm new to PHP and MySQL. I wrote this code and I would like to know if I can improve it. <?php $user = POST METHOD $licenseID = POST METHOD $resultLic= mysql_query("SELECT id , idUser , idLicense FROM license WHERE idUser = '$user'") or die(mysql_error()); $resultNumber = mysql_num_rows($resultLic); $keyFound = '0'; // If keyfound is 1 the key is stored in DB while ($rows = mysql_fetch_array($resultLic,MYSQL_BOTH)) { //this loop check if the $licenseID is stored in DB or not for($i=0; $i< $resultNumber ; i++) { if($rows['idLicense'] === $licenseID) { //Just for the debug echo("License Found"); $keyFound = '1'; break; } //If key isn't in DB and there are less than 3 keys the new key will be store in DB if($keyfound == '0' && $resultNumber < 3) { mysql_query( Update users set ...Store $licenseID in Table) } // Else mean that the user want user another generated key (from the client) in the DB and i will be ban (It's wrote in TOS terms that they cant use the software on more than 3 different station) else { mysql_query( update users set ban ='1'.....etc ); } } ?> I know that this code seems really bad, so I would like to know how I can improve it. Could someone give me any advice? I chose to have 2 tables: users, where all the information about the users is, with fields id, username, password; and another table license with fields id, idUsername, idLicense (the last one stores the licenses that the software generates).
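
    One way to avoid looping over rows in PHP at all is to let MySQL answer the two questions directly (is this exact key already stored for the user, and how many keys does the user have so far). A minimal sketch using the table and column names from the question's query; the ? placeholders stand for the user id and the submitted key:

        -- Does this exact key already exist for this user?
        SELECT COUNT(*) FROM license WHERE idUser = ? AND idLicense = ?;

        -- If not: how many keys has this user registered so far?
        -- (ban if the count is already at the limit, otherwise insert the new key)
        SELECT COUNT(*) FROM license WHERE idUser = ?;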

    Read the article

  • SQL SERVER – Follow up – Usage of $rowguid and $IDENTITY

    - by pinaldave
    The most common question I receive is: why do I blog? The answer is even simpler – I blog because I get extremely constructive comments and conversation from people like DHall and Kumar Harsh. Earlier this week, I shared a conversation between Madhivanan and myself regarding how to find out if a table uses ROWGUID or not. I encourage all of you to read the conversation here: SQL SERVER – Identifying Column Data Type of uniqueidentifier without Querying System Tables. In simple words, the conversation between Madhivanan and myself brought out a simple query which returns the values of the UNIQUEIDENTIFIER without knowing the name of the column. David Hall wrote a few excellent comments as a follow-up, and every SQL enthusiast must read them: first, second and third. David always brings positive energy; he first shows the limitations of my solution here and here, which he follows up with his own solution here. As he said, his solution is also not perfect, but it indeed leaves learning bites for all of us – worth reading if you are interested in unorthodox solutions. Kumar Harsh suggested that one can also find the identity column used in the table in a very similar way using $IDENTITY. Here is how one can do the same. DECLARE @t TABLE ( GuidCol UNIQUEIDENTIFIER DEFAULT newsequentialid() ROWGUIDCOL, IDENTITYCL INT IDENTITY(1,1), data VARCHAR(60) ) INSERT INTO @t (data) SELECT 'test' INSERT INTO @t (data) SELECT 'test1' SELECT $rowguid,$IDENTITY FROM @t There are also alternate ways to find an identity column in the database. The following query will give a list of all identity column names with their corresponding table names. SELECT SCHEMA_NAME(so.schema_id) SchemaName, so.name TableName, sc.name ColumnName FROM sys.objects so INNER JOIN sys.columns sc ON so.OBJECT_ID = sc.OBJECT_ID AND sc.is_identity = 1 Let me know if you use any alternate method related to identity; I would like to know what you do and how you do it when you have to deal with an identity column. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Technology
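
    As one more alternate method in the same spirit, offered here as a suggestion rather than something from the post itself: the sys.identity_columns catalog view exposes identity columns directly, together with their seed and increment values.

        SELECT SCHEMA_NAME(o.schema_id) AS SchemaName,
               o.name AS TableName,
               ic.name AS ColumnName,
               ic.seed_value,
               ic.increment_value
        FROM sys.identity_columns ic
        INNER JOIN sys.objects o ON ic.object_id = o.object_id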

    Read the article

  • CodePlex Daily Summary for Monday, June 24, 2013

    CodePlex Daily Summary for Monday, June 24, 2013Popular ReleasesVG-Ripper & PG-Ripper: VG-Ripper 2.9.43: changes NEW: Added Support for "ImageJumbo.com" links NEW: Added Support for "ImgTiger.com" links NEW: Added Support for "ImageDax.net" links NEW: Added Support for "HosterBin.com" links FIXED: "ImgServe.net" linksWPF Composites: Version 4.3.0: In this Beta release, I broke my code out into two separate projects. There is a core FasterWPF.dll with the minimal required functionality. This can run with only the Aero.dll and the Rx .dll's. Then, I have a FasterWPFExtras .dll that requires and supports the Extended WPF Toolkit™ Community Edition V 1.9.0 (including Xceed DataGrid) and the Thriple .dll. This is for developers who want more . . . Finally, you may notice the other OPTIONAL .dll's available in the download such as the Dyn...Windows.Forms.Controls Revisited (SSTA.WinForms): SSTA.WinForms 1.0.0: Latest stable releaseAscend 3D: Ascend 2.0: Release notes: Implemented bone/armature animation Refactored class hierarchy and naming Addressed high CPU usage issue during animation Updated the Blender exporter and Ascend model format (now XML) Created AscendViewer, a tool for viewing Ascend modelsIndent Guides for Visual Studio: Indent Guides v13: ImportantThis release does not support Visual Studio 2010. The latest stable release for VS 2010 is v12.1. Version History Changed in v13 Added page width guide lines Added guide highlighting options Fixed guides appearing over collapsed blocks Fixed guides not appearing in newly opened files Fixed some potential crashes Fixed lines going through pragma statements Various updates for VS 2012 and VS 2013 Removed VS 2010 support Changed in v12.1: Fixed crash when unable to start...Fluent Ribbon Control Suite: Fluent Ribbon Control Suite 2.1.0 - Prerelease d: Fluent Ribbon Control Suite 2.1.0 - Prerelease d(supports .NET 3.5, 4.0 and 4.5) Includes: Fluent.dll (with .pdb and .xml) Showcase Application Samples (not for .NET 3.5) Foundation (Tabs, Groups, Contextual Tabs, Quick Access Toolbar, Backstage) Resizing (ribbon reducing & enlarging principles) Galleries (Gallery in ContextMenu, InRibbonGallery) MVVM (shows how to use this library with Model-View-ViewModel pattern) KeyTips ScreenTips Toolbars ColorGallery *Walkthrough (do...Magick.NET: Magick.NET 6.8.5.1001: Magick.NET compiled against ImageMagick 6.8.5.10. Breaking changes: - MagickNET.Initialize has been made obsolete because the ImageMagick files in the directory are no longer necessary. - MagickGeometry is no longer IDisposable. - Renamed dll's so they include the platform name. - Image profiles can now only be accessed and modified with ImageProfile classes. - Renamed DrawableBase to Drawable. - Removed Args part of PathArc/PathCurvetoArgs/PathQuadraticCurvetoArgs classes. The...Three-Dimensional Maneuver Gear for Minecraft: TDMG 1.1.0.0 for 1.5.2: CodePlex???(????????) ?????????(???1/4) ??????????? ?????????? ???????????(??????????) ??????????????????????? ↑????、?????????????????????(???????) ???、??????????、?????????????????????、????????1.5?????????? Shift+W(????)??????????????????10°、?10°(?????????)???Hyper-V Management Pack Extensions 2012: HyperVMPE2012 (v1.0.1.126): Hyper-V Management Pack Extensions 2012 Beta ReleaseOutlook 2013 Add-In: Email appointments: This new version includes the following changes: - Ability to drag emails to the calendar to create appointments. 
Will gather all the recipients from all the emails and create an appointment on the day you drop the emails, with the text and subject of the last selected email (if more than one selected). - Increased maximum of numbers to display appointments to 30. You will have to uninstall the previous version (add/remove programs) if you had installed it before. Before unzipping the file...Caliburn Micro: WPF, Silverlight, WP7 and WinRT/Metro made easy.: Caliburn.Micro v1.5.2: v1.5.2 - This is a service release. We've fixed a number of issues with Tasks and IoC. We've made some consistency improvements across platforms and fixed a number of minor bugs. See changes.txt for details. Packages Available on Nuget Caliburn.Micro – The full framework compiled into an assembly. Caliburn.Micro.Start - Includes Caliburn.Micro plus a starting bootstrapper, view model and view. Caliburn.Micro.Container – The Caliburn.Micro inversion of control container (IoC); source code...SQL Compact Query Analyzer: 1.0.1.1511: Beta build of SQL Compact Query Analyzer Bug fixes: - Resolved issue where the application crashes when loading a database that contains tables without a primary key Features: - Displays database information (database version, filename, size, creation date) - Displays schema summary (number of tables, columns, primary keys, identity fields, nullable fields) - Displays the information schema views - Displays column information (database type, clr type, max length, allows null, etc) - Support...CODE Framework: 4.0.30618.0: See change notes in the documentation section for details on what's new. Note: If you download the class reference help file with, you have to right-click the file, pick "Properties", and then unblock the file, as many browsers flag the file as blocked during download (for security reasons) and thus hides all content.Toolbox for Dynamics CRM 2011: XrmToolBox (v1.2013.6.18): XrmToolbox improvement Use new connection controls (use of Microsoft.Xrm.Client.dll) New display capabilities for tools (size, image and colors) Added prerequisites check Added Most Used Tools feature Tools improvementNew toolSolution Transfer Tool (v1.0.0.0) developed by DamSim Updated toolView Layout Replicator (v1.2013.6.17) Double click on source view to display its layoutXml All tools list Access Checker (v1.2013.6.17) Attribute Bulk Updater (v1.2013.6.18) FetchXml Tester (v1.2013.6.1...Media Companion: Media Companion MC3.570b: New* Movie - using XBMC TMDB - now renames movies if option selected. * Movie - using Xbmc Tmdb - Actor images saved from TMDb if option selected. Fixed* Movie - Checks for poster.jpg against missing poster filter * Movie - Fixed continual scraping of vob movie file (not DVD structure) * Both - Correctly display audio channels * Both - Correctly populate audio info in nfo's if multiple audio tracks. * Both - added icons and checked for DTS ES and Dolby TrueHD audio tracks. * Both - Stream d...Document.Editor: 2013.24: What's new for Document.Editor 2013.24: Improved Video Editing support Improved Link Editing support Minor Bug Fix's, improvements and speed upsExtJS based ASP.NET Controls: FineUI v3.3.0: ??FineUI ?? ExtJS ??? ASP.NET ???。 FineUI??? ?? No JavaScript,No CSS,No UpdatePanel,No ViewState,No WebServices ???????。 ?????? IE 7.0、Firefox 3.6、Chrome 3.0、Opera 10.5、Safari 3.0+ ???? Apache License v2.0 ?:ExtJS ?? GPL v3 ?????(http://www.sencha.com/license)。 ???? ??:http://fineui.com/bbs/ ??:http://fineui.com/demo/ ??:http://fineui.com/doc/ ??:http://fineui.codeplex.com/ FineUI???? 
ExtJS ?????????,???? ExtJS ?。 ????? FineUI ? ExtJS ?:http://fineui.com/bbs/forum.php?mod=viewthrea...BarbaTunnel: BarbaTunnel 8.0: Check Version History for more information about this release.ExpressProfiler: ExpressProfiler v1.5: [+] added Start time, End time event columns [+] added SP:StmtStarting, SP:StmtCompleted events [*] fixed bug with Audit:Logout eventpatterns & practices: Data Access Guidance: Data Access Guidance Drop4 2013.06.17: Drop 4New ProjectsActivity Injector: Activity Injector is mainly used in unit testing workflow activities.Application Starter Tremens: Application for starting ApplicationaAudiwikiREF: This project is created to finish my referral assignment because i faced many problems with the previous one where i couldn't connect the project and commit. AVTS GLUCK ANTIVIRUS SYSTEM: AVTS Gluck AntiVirus System - the best antivirus made by young developers from Ukraine. It was writed in C#, C++, VB.NET, Excutable files, PHP DS, .and more....Base64 File Converter: File converting tool that converts any file into Base64 TXT file and vice versa just by drag-and-drop gesture.Basket Builders Umbraco bootstrap packages: Our umbraco packages to speed up project Bugzy Bug Tracking and Work Management: Creating Bug Tracking and Work Managment system while helping the JSON-RPC standard become as strong as the Bloated XML-RPC(SOAP)Dalmatian Build Script: Dalmatian allows you to automate your build using C# of VB.NETDécouverte Majeure MISN Chrono-courses: Chronocourses projectDocToolTip Library: DocToolTip library allows creating screenshots of winform forms with the name and type of component controls.DotaPick: SummaryHiUpdateTools - easy publish and update your app: HiUpdateTools is a easy tools to use publish new version of your application Kinect Skeleton Recorder: Records and shows XBox 360 Kinect skeleton data using windows forms. Saves the data as XML that can be read by other applications for use in animation.MARIT (BETA): MARIT is a simple drawing app for Android. More features are on the todo-list so stay tuned! Feedback and requests are appreciated. MvcToolbox: MVC Toolbox is a library which contains lots of usefull methods which will help developer to write less code and to be more productive.PrototypeRef: list of testing toolsSmartPlay: SmartPlaySolid Geometry Simulation: Implements a simple fur/hair dynamic engine using physBAM. This package includes a semi-implicit solver for hair simulation and plugins for maya/houdini.Windows.Forms.Controls Revisited (SSTA.WinForms): SearchableListBox control built on the Window,Form.ListBox renders a list box items with multiple search terms highlighted.YiLeSystem: This is a dining system.

    Read the article

  • SQLAuthority News – BI Quiz Question – How to Optimize Cube? – Hints

    - by pinaldave
    I earlier wrote about the SQL BI Quiz over here. The details of the quiz are as follows: Working with huge data is very common when it comes to Data Warehousing. It is necessary to create cubes on the data to make it meaningful and consumable. There are cases when retrieving the data from a cube takes a lot of time. Let us assume that your cube is returning data very quickly. Suddenly, one day, it is returning the data very slowly. What are the three things you will do to diagnose this? After diagnosing it, what will you do to resolve the performance issue? Participate in my question over here. Here are a couple of hints about what I am looking for in an answer: How do you reach the root of the slow performance? Is hardware causing the problem, or something else? Is the slowness due to how the cube is built and its granularity? Do the underlying tables require maintenance? Is there a chance to refactor the process? Are there any tools which can help diagnose the slowness of the cube? It is not necessary to answer all the questions – but it is something to start with. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Puzzle, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Organising data access for dependency injection

    - by IanAWP
    In our company we have a relatively long history of database backed applications, but have only just begun experimenting with dependency injection. I am looking for advice about how to convert our existing data access pattern into one more suited for dependency injection. Some specific questions: Do you create one access object per table (Given that a table represents an entity collection)? One interface per table? All of these would need the low level Data Access object to be injected, right? What about if there are dozens of tables, wouldn't that make the composition root into a nightmare? Would you instead have a single interface that defines things like GetCustomer(), GetOrder(), etc? If I took the example of EntityFramework, then I would have one Container that exposes an object for each table, but that container doesn't conform to any interface itself, so doesn't seem like it's compatible with DI. What we do now, in case it helps: The way we normally manage data access is through a generic data layer which exposes CRUD/Transaction capabilities and has provider specific subclasses which handle the creation of IDbConnection, IDbCommand, etc. Actual table access uses Table classes that perform the CRUD operations associated with a particular table and accept/return domain objects that the rest of the system deals with. These table classes expose only static methods, and utilise a static DataAccess singleton instantiated from a config file.

    Read the article

  • Cloud computing - database loading question

    - by workwise
    The following is my situation; I want to know whether what I want is possible in cloud computing and whether it is the best way for me: 1) My main site has a database with tables with millions of rows, and entries are added almost every second. 2) I will set up a MySQL mirror, so there will be a backup database always in sync with the main one. 3) There are a few tens of thousands of images - and growing. So say the total size of the images is a few tens of gigabytes. I will be keeping the image data in sync on the backup server as well. 4) There can be short periods where traffic can go to 100X the average traffic. 5) I will be using memcache heavily - most database data and even frequently used disk files/images will be in RAM. I want the main site to run on a dedicated server. The backup server is, say, an Amazon EC2 instance. Now note that since it is a live backup, I need to run a small instance continuously. I want that when I anticipate high traffic, I should be able to run a large instance on the cloud and transfer the traffic there. The main point is - I do not want to spend time "loading" the database on the large instance, as it can typically take a few minutes or even hours (from experience). So is it possible to just scale the memory/CPU on demand, without having to load the database or sync up the filesystem? I want to set up my backup scripts etc. just ONCE. Thanks, JP

    Read the article

  • Databases and the CI server

    - by mlk
    I have a CI server (Hudson) which merrily builds, runs unit tests and deploys to the development environment, but I'd now like to get it running the integration tests. The integration tests will hit a database, and that database will constantly be changed to contain the data relevant to the test in question. This however leads to a problem - how do I make sure the database is not being splatted with data for one test and then that data being overwritten by a second project before the first set of tests completes? I am currently using the "hope" method, which is not working out too badly at the moment, but mostly due to the fact that we only have a small number of integration tests set up on CI. As I see it I have the following options: 1) Test-local (in-memory) databases - I'm not sure if any in-memory databases handle all the scariness of Oracle's triggers and packages etc., and anything less I don't feel would be a worthwhile test. 2) CI executor-local databases - a fair amount of work would be needed to set this up and keep 'em up to date, but definitely an option (most of the work is already done to keep the current CI database up-to-date). 3) A single "integration test" executor - likely the easiest to implement, but it would mean the integration tests could fall quite far behind. 4) Locking the database (or set of tables). I'm sure I've missed some ways (please add them). How do you run database-based integration tests on the CI server? What issues have you had and what method do you recommend? (Note: While I use Hudson, I'm happy to accept answers for any CI server; the ideas I'm sure will be portable, even if the details are not.) Cheers, Mlk

    Read the article
