Search Results

Search found 36419 results on 1457 pages for 'sql extended events'.


  • SQL Server: String Manipulation, Unpivoting

    - by OMG Ponies
    I have a column called body, which contains body content for our CMS. The data looks like: ...{cloak:id=1.1.1}...{cloak}...{cloak:id=1.1.2}...{cloak}...{cloak:id=1.1.3}...{cloak}... An example, moderately tweaked for readability: ## h5. A formal process for approving and testing all external network connections and changes to the firewall and router configurations? {toggle-cloak:id=1.1.1}{tree-plus-icon} *Compliance:* {color:red}{*}Partial{*}{color} (?) {cloak:id=1.1.1} || Date: | 2010-03-15 || || Owner: | Brian || || Researched by: | || || Narrative: | Jira tickets are normally used to approve and track network changes\\ || || Artifacts: | Jira.bccampus.ca\\ || || Recommendation: | Need to update policy that no Jira = no change\\ || || Proposed Remedy(ies): | || || Approved Remedy(ies): | || || Date: | || || Reviewed by: | || || Remarks/comments: | || {cloak}## h5. Current network diagrams with all connections to cardholder data, including any wireless networks? {toggle-cloak:id=1.1.2}{tree-plus-icon} *Compliance:* {color:red}{*}TBD{*}{color} (?) {cloak:id=1.1.2} I'd like to get the cloak values out in the following format:
        requirement_num
        -----------------
        1.1.1
        1.1.2
        1.1.3
    I'm looking at using UNIONs - does anyone have a better recommendation? Forgot to mention: I can't use regex, because CLR isn't enabled on the database. The numbers aren't sequential; the current record jumps from 1.1.6 to 1.2.1.
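    One CLR-free way to get at those ids (a sketch only; the table and column names dbo.cms_page, page_id and body are placeholders, and it assumes body is (n)varchar(max) rather than text/ntext): walk each row with a recursive CTE, jumping from one '{cloak:id=' marker to the next and cutting out whatever sits before the following '}'.

        ;WITH hits AS
        (
            -- anchor: first '{cloak:id=' in each body
            SELECT  p.page_id, p.body,
                    CHARINDEX('{cloak:id=', p.body) AS pos
            FROM    dbo.cms_page AS p
            WHERE   p.body LIKE '%{cloak:id=%'
            UNION ALL
            -- recursive step: the next occurrence after the previous one
            SELECT  h.page_id, h.body,
                    CHARINDEX('{cloak:id=', h.body, h.pos + 1)
            FROM    hits AS h
            WHERE   CHARINDEX('{cloak:id=', h.body, h.pos + 1) > 0
        )
        SELECT  page_id,
                SUBSTRING(body,
                          pos + LEN('{cloak:id='),
                          CHARINDEX('}', body, pos) - (pos + LEN('{cloak:id='))) AS requirement_num
        FROM    hits
        ORDER BY page_id, requirement_num
        OPTION (MAXRECURSION 0);

    This avoids both regex/CLR and a UNION per id; the '{toggle-cloak:id=...}' markers are skipped automatically because they don't contain the literal '{cloak:id='.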

    Read the article

  • help with t-sql data aggregation

    - by stackoverflowuser
    Based on the following table

        Area  S1  S2  S3  S4
        ---------------------
        A1     5  10  20   0
        A2    11  19  15  20
        A3     0   0   0  20

    I want to generate an output that will give the number of columns not having "0". So the output would be

        Area  S1  S2  S3  S4  Count
        ----------------------------
        A1     5  10  20   0      3
        A2    11  19  15  20      4
        A3     0   0   0  20      1
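    If the goal is just that per-row count, one CASE expression per column does it without unpivoting. A sketch, assuming the table and column names from the sample data (dbo.AreaScores is a placeholder):

        SELECT  Area, S1, S2, S3, S4,
                CASE WHEN S1 <> 0 THEN 1 ELSE 0 END
              + CASE WHEN S2 <> 0 THEN 1 ELSE 0 END
              + CASE WHEN S3 <> 0 THEN 1 ELSE 0 END
              + CASE WHEN S4 <> 0 THEN 1 ELSE 0 END AS [Count]
        FROM    dbo.AreaScores;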

    Read the article

  • t-sql getting leaf nodes

    - by stackoverflowuser
    Based on the following table (I have kept spaces between the rows for clarity):

        Path
        -----------
        \node1\node2\node3
        \node1\node2\node3\node5
        \node1\node6\node3
        \node1\node4\node3
        \node1\node4\node3\node7
        \node1\node4\node3\node8
        \node1\node4\node3\node9
        \node1\node4\node3\node9\node10

    I want to get all the paths containing a leaf node. For instance, the following will be considered leaf nodes under the path \node1\node4\node3:

        \node1\node4\node3\node7
        \node1\node4\node3\node8
        \node1\node4\node3\node9\node10

    The following will be the output:

        Output
        ---------------------------
        \node1\node2\node3\node5
        \node1\node6\node3
        \node1\node4\node3\node7
        \node1\node4\node3\node8
        \node1\node4\node3\node9\node10

    Please suggest. Thanks.
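    One way to phrase "leaf" in set terms (a sketch; the table name dbo.Tree is a placeholder): a path is a leaf when no other row extends it with a further segment.

        SELECT  t.Path
        FROM    dbo.Tree AS t
        WHERE   NOT EXISTS
                (
                    SELECT 1
                    FROM   dbo.Tree AS child
                    WHERE  child.Path LIKE t.Path + '\%'
                );

    Against the sample rows this returns exactly the five paths listed in the expected output.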

    Read the article

  • How to realize this in Transact SQL (SQL Server 2008)

    - by David
    I just want an update trigger like this PostgreSQL version... It seems to me there is no NEW. and OLD. in MSSQL?

        CREATE OR REPLACE FUNCTION "public"."ts_create" () RETURNS trigger AS
        DECLARE
        BEGIN
          NEW.ctime := now();
          RETURN NEW;
        END;

    I've Googled already, but nothing to be found unfortunately... Ideas?
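    There is indeed no NEW/OLD in T-SQL: an AFTER trigger fires once per statement and exposes the affected rows through the inserted and deleted pseudo-tables, so the usual equivalent is an UPDATE joined back to the base table. A rough sketch (dbo.MyTable, id and ctime are placeholder names for your own schema):

        CREATE TRIGGER trg_MyTable_ctime
        ON dbo.MyTable
        AFTER UPDATE
        AS
        BEGIN
            SET NOCOUNT ON;
            -- stamp the rows touched by the UPDATE; join on the table's primary key
            UPDATE t
            SET    t.ctime = GETDATE()
            FROM   dbo.MyTable AS t
            JOIN   inserted    AS i ON i.id = t.id;
        END;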

    Read the article

  • Introducing Microsoft SQL Server 2008 R2 - Business Intelligence Samples

    - by smisner
    On April 14, 2010, Microsoft Press (blog | twitter) released my latest book, co-authored with Ross Mistry (twitter), as a free ebook download - Introducing Microsoft SQL Server 2008 R2. As the title implies, this ebook is an introduction to the latest SQL Server release. Although you'll find a comprehensive review of the product's features in this book, you will not find the step-by-step details that are typical in my other books. For those readers who are interested in a more interactive learning experience, I have created two sample files for download: the IntroSQLServer2008R2Samples project and the Sales Analysis workbook. Here's a recap of the business intelligence chapters and the samples I used to generate the screen shots by chapter: Chapter 6: Scalable Data Warehousing covers a new edition of SQL Server, Parallel Data Warehouse. Understandably, Microsoft did not ship me the software and hardware to set up my own Parallel Data Warehouse environment for testing purposes and consequently you won't see any screenshots in this chapter. I received a lot of information and a lot of help from the product team during the development of this chapter to ensure its technical accuracy. Chapter 7: Master Data Services is a new component in SQL Server. After you install Master Data Services (MDS), which is a separate installation from SQL Server although it's found on the same media, you can install sample models to explore (which is what I did to create screenshots for the book). To do this, you deploy the packages found at \Program Files\Microsoft SQL Server\Master Data Services\Samples\Packages. You will first need to use the Configuration Manager (in the Microsoft SQL Server 2008 R2\Master Data Services program group) to create a database and a Web application for MDS. Then when you launch the application, you'll see a Getting Started page which has a Deploy Sample Data link that you can use to deploy any of the sample packages. Chapter 8: Complex Event Processing is an introduction to another new component, StreamInsight. This topic was way too large to cover in-depth in a single chapter, so I focused on information such as architecture, development models, and an overview of the key sections of code you'll need to develop for your own applications. StreamInsight is an engine that operates on data in-flight and as such has no user interface that I could include in the book as screenshots. The November CTP version of SQL Server 2008 R2 included code samples as part of the installation, but these are not the official samples that will eventually be available in Codeplex. At the time of this writing, the samples are not yet published. Chapter 9: Reporting Services Enhancements provides an overview of all the changes to Reporting Services in SQL Server 2008 R2, and there are many! In previous posts, I shared more details than you'll find in the book about new functions (Lookup, MultiLookup, and LookupSet), properties for page numbering, and the new global variable RenderFormat. I will confess that I didn't use actual data in the book for my discussion on the Lookup functions, but I did create real reports for the blog posts and will upload those separately. For the other screenshots and examples in the book, I have created the IntroSQLServer2008R2Samples project for you to download. To preview these reports in Business Intelligence Development Studio, you must have the AdventureWorksDW2008R2 database installed, and you must download and install SQL Server 2008 R2.
    For the map report, you must execute the PopulationData.sql script that I included in the samples file to add a table to the AdventureWorksDW2008R2 database. The IntroSQLServer2008R2Samples project includes the following files:
        01_AggregateOfAggregates.rdl to illustrate the use of embedded aggregate functions
        02_RenderFormatAndPaging.rdl to illustrate the use of page break properties (Disabled, ResetPageNumber), the PageName property, and the RenderFormat global variable
        03_DataSynchronization.rdl to illustrate the use of the DomainScope property
        04_TextboxOrientation.rdl to illustrate the use of the WritingMode property
        05_DataBar.rdl
        06_Sparklines.rdl
        07_Indicators.rdl
        08_Map.rdl to illustrate a simple analytical map that uses color to show population counts by state
        PopulationData.sql to provide the data necessary for the map report
    Chapter 10: Self-Service Analysis with PowerPivot introduces two new components to the Microsoft BI stack, PowerPivot for Excel and PowerPivot for SharePoint, which you can learn more about at the PowerPivot site. To produce the screenshots for this chapter, I created the Sales Analysis workbook which you can download (although you must have Excel 2010 and the PowerPivot for Excel add-in installed to explore it fully). It's a rather simple workbook because space in the book did not permit a complete exploration of all the wonderful things you can do with PowerPivot. I used a tutorial that was available with the CTP version as a basis for the report so it might look familiar if you've already started learning about PowerPivot. In future posts, I'll continue exploring the new features in greater detail. If there are any special requests, please let me know!

    Read the article

  • Support for 15,000 Partitions in SQL Server 2008 SP2 and SQL Server 2008 R2 SP1

    In SQL Server 2008 and SQL Server 2008 R2, the number of partitions on tables and indexes is limited to 1,000. This paper discusses how SQL Server 2008 SP2 and SQL Server 2008 R2 SP1 address this limitation by providing an option to increase the limit to 15,000 partitions. It describes how support for 15,000 partitions can be enabled and disabled on a database. It also talks about performance characteristics, certain limitations associated with this support, known issues, and their workarounds. This support is targeted to enterprise customers and ISVs with large-scale decision support or data warehouse requirements.
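    For reference, the switch the paper describes is enabled per database. The sketch below uses the sp_db_increased_partitions procedure quoted from memory, so verify the exact name and arguments against the paper before relying on it.

        -- Assumed syntax (check the whitepaper): turn 15,000-partition support on for one database.
        EXEC sp_db_increased_partitions @dbname = N'SalesDW', @increased_partitions = 'ON';

        -- Turning it back off requires that no table or index in the database still has more than 1,000 partitions.
        EXEC sp_db_increased_partitions @dbname = N'SalesDW', @increased_partitions = 'OFF';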

    Read the article

  • Data-tier Applications in SQL Server 2008 R2

    - by BuckWoody
    I had the privilege of presenting to the Adelaide SQL Server User Group in Australia last evening, and I covered the Data-tier Application (DAC) and the Utility Control Point (UCP) from SQL Server 2008 R2. Here are some links from that presentation:
        Whitepaper: http://msdn.microsoft.com/en-us/library/ff381683.aspx
        Tutorials: http://msdn.microsoft.com/en-us/library/ee210554(SQL.105).aspx
        From Visual Studio: http://msdn.microsoft.com/en-us/library/dd193245(VS.100).aspx
        Restrictions and capabilities by Edition: http://msdn.microsoft.com/en-us/library/cc645993(SQL.105).aspx
        Glenn Berry's blog entry on scripts for UCP/DAC: http://www.sqlservercentral.com/blogs/glennberry/archive/2010/05/19/sql-server-utility-script-from-24-hours-of-pass.aspx
        Objects supported by a DAC: http://msdn.microsoft.com/en-us/library/ee210549(SQL.105).aspx

    Read the article

  • Approaching events #mstc11 #ppws #sqlbits

    - by Marco Russo (SQLBI)
    The spring season is always full of events and I’m just preparing for a number of them. First of all, we are getting very good interest in the PowerPivot Workshop in Copenhagen on 21-22 March 2011. Tomorrow (Friday March 4) will be the last day to take advantage of the Early Bird rate for this date. We will also participate in an evening meeting of local user groups on March 21 in Copenhagen; more news about this in the next few days. Other scheduled dates are in Dublin (28-29 March 2011) and in...(read more)

    Read the article

  • HTG Explains: Should You Buy Extended Warranties?

    - by Chris Hoffman
    Buy something at an electronics store and you’ll be confronted by a pushy salesperson who insists you need an extended warranty. You’ll also see extended warranties pushed hard when shopping online. But are they worth it? There’s a reason stores push extended warranties so hard. They’re almost always pure profit for the store involved. An electronics store may live on razor-thin product margins and make big profits on extended warranties and overpriced HDMI cables. You’re Already Getting Multiple Warranties First, back up. The product you’re buying already includes a warranty. In fact, you’re probably getting several different types of warranties. Store Return and Exchange: Most electronics stores allow you to return a malfunctioning product within the first 15 or 30 days and they’ll provide you with a new one. The exact period of time will vary from store to store. If you walk out of the store with a defective product and have to swap it for a new one within the first few weeks, this should be easy. Manufacturer Warranty: A device’s manufacturer — whether the device is a laptop, a television, or a graphics card — offers their own warranty period. The manufacturer warranty covers you after the store refuses to take the product back and exchange it. The length of this warranty depends on the type of product. For example, a cheap laptop may only offer a one-year manufacturer warranty, while a more expensive laptop may offer a two-year warranty. Credit Card Warranty Extension: Many credit cards offer free extended warranties on products you buy with that credit card. Credit card companies will often give you an additional year of warranty. For example, if you buy a laptop with a two year warranty and it fails in the third year, you could then contact your credit card company and they’d cover the cost of fixing or replacing it. Check your credit card’s benefits and fine print for more information. Why Extended Warranties Are Bad You’re already getting a fairly long warranty period, especially if you have a credit card that offers you a free extended warranty — these are fairly common. If the product you get is a “lemon” and has a manufacturing error, it will likely fail pretty soon — well within your warranty period. The extended warranty matters after all your other warranties are exhausted. In the case of a laptop with a two-year warranty that you purchase with a credit card giving you a one-year warranty extension, your extended warranty will kick in three years after you purchase the laptop. In that many years, your current laptop will likely feel pretty old and laptops that are as good — or better — will likely be pretty cheap. If it’s a television, better television displays will be available at a lower price point. You’ll either want to upgrade to a newer model or you’ll be able to buy a new, just-as-good product for very cheap. You’ll only have to pay out-of-pocket if your device fails after the normal warranty period — in over two or three years for typical laptops purchased with a decent credit card. Save the money you would have spent on the warranty and put it towards a future upgrade. How Much Do Extended Warranties Cost? Let’s look at an example from a typical pushy retail outlet, Best Buy. We went to Best Buy’s website and found a pretty standard $600 Samsung laptop. This laptop comes with a one-year warranty period. If purchased with a fairly common credit card, you can easily get a two-year warranty period on this laptop without spending an additional penny. 
(Yes, such credit cards are available with no yearly fees.) During the check-out process, Best Buy tries to sell you a Geek Squad “Accidental Protection Plan.” To get an additional year of Best Buy’s extended warranty, you’d have to pay $324.98 for a “3-Year Accidental Protection Plan”. You’d basically be paying more than half the price of your laptop for an additional year of warranty — remember, the standard warranties would cover you anyway for the first two years. If this laptop did break sometime between two and three years from now, we wouldn’t be surprised if you could purchase a comparable laptop for about $325 anyway. And, if you don’t need to replace it, you’ve saved that money. Best Buy would object that this isn’t a standard extended warranty. It’s a supercharged warranty plan that will also provide coverage if you spill something on your laptop or drop it and break it. You just have to ask yourself a question. What are the odds that you’ll drop your laptop or spill something on it? They’re probably pretty low if you’re a typical human being. Is it worth spending more than half the price of the laptop just in case you’ll make an uncommon mistake? Probably not. There may be occasional exceptions to this — some Apple users swear by Apple’s AppleCare, for example — but you should generally avoid buying these things. There’s a reason stores are so pushy about extended warranties, and it’s not because they want to help protect you. It’s because they’re making lots of profit from these plans, and they’re making so much profit because they’re not a good deal for customers. Image Credit: Philip Taylor on Flickr     

    Read the article

  • SQL Developer at Oracle Open World 2012

    - by thatjeffsmith
    We have a lot going on in San Francisco this fall. One of the most personally exciting bits, for what will be my 4th or 5th Open World, is that this will be my FIRST as a member of Team Oracle. I’ve presented once before, but most years it was just me pressing flesh at the vendor booths. After 3-4 days of standing and talking, you’re ready to just go home and not do anything for a few weeks. This time I’ll have a chance to walk around and talk with our users and get a good idea of what’s working and what’s not. Of course it will be a great opportunity for you to find us and get to know your SQL Developer team!
    3.4 miles across and back – thanks Ashley for signing me up for the run!
    This year is going to be a bit crazy. Work-wise I’ll be presenting twice, working a booth, and proctoring several of our Hands-On Labs. The fun parts will be equally crazy though – running across the Bay Bridge (I don’t run), swimming the Bay (I don’t swim), having my wife fly out on Wednesday for the concert, and then our first WhiskyFest on Friday (I do drink whisky though.) But back to work – let’s talk about EVERYTHING you can expect from the SQL Developer team.
    Booth Hours
    We’ll have 2 ‘demo pods’ in the Exhibition Hall over at Moscone South. Look for the farm of Oracle booths, we’ll be there under the signs that say ‘SQL Developer.’ There will be several people on hand, mostly developers (yes, they still count as people), who can answer your questions or demo the latest features. Come by and say ‘Hi!’, and let us know what you like and what you think we can do better. Seriously.
        Monday 10AM – 6PM
        Tuesday 9:45AM – 6PM
        Wednesday 9:45AM – 4PM
    Presentations
    Stop by for an hour, pull up a chair, sit back and soak in all the SQL Developer goodness. You’ll only have to suffer my bad jokes for two of the presentations, so please at least try to come to the other ones. We’ll be talking about data modeling, migrations, source control, and new features in versions 3.1 and 3.2 of SQL Developer and SQL Developer Data Modeler.
        Monday 10:45 – What’s New in SQL Developer
        Monday 4:45 – Why Move to Oracle Application Express Listener
        Tuesday 10:15 – Using Subversion in Oracle SQL Developer Data Modeler
        Tuesday 11:45 – Oracle SQL Developer Tips & Tricks
        Tuesday 5:00 – Database Design with Oracle SQL Developer Data Modeler
        Wednesday 11:45 – Migrating Third-Party Databases and Applications to Oracle Exadata
        Wednesday 3:30 – 11g Enterprise Options and Management Packs for Developers
    Hands On Labs (HOLs)
    The Hands On Labs allow you to come into a classroom environment, sit down at a computer, and run through some exercises. We’ll provide the hardware, software, and training materials. It’s self-paced, but we’ll have several helpers walking around to answer questions and chat up any SQL Developer or database topic that comes to mind. If your employer is sending you to Open World for all that great training, the HOLs are a great opportunity to capitalize on that. They are only 60 minutes each, so you don’t have to worry about burning out. And there’s no homework! Of course, if you do want to take the labs home with you, many are already available via the Developer Day Hands-On Database Applications Developer Lab. You will need your own computer for those, but we’ll take care of the rest.
        Wednesday 10:15 – PL/SQL Development and Unit Testing with Oracle SQL Developer
        Wednesday 11:45 – Performance Tuning with Oracle SQL Developer
        Thursday 11:15 – The Soup to Nuts of Data Modeling with Oracle SQL Developer Data Modeler
    Some Parting Advice
    Always wanted to meet your favorite Oracle authors, speakers, and thought-leaders? Don’t be shy, walk right up to them and introduce yourself. Normal social rules still apply, but at the conference everyone is open and up for meeting and talking with attendees. Just understand if there’s a line that you might only get a minute or two. It’s a LONG conference though, so you’ll have plenty of time to catch up with everyone. If you’re going to be around on Tuesday evening, head on over to the OTN Lounge from 4:30 to 6:30 and hang out for our Tweet Meet. That’s right, all the Oracle nerds on Twitter will be there in one place. Be sure to put your Twitter handle on your name tag so we know who you are!

    Read the article

  • How to Automate your Database Documentation

    - by Jonathan Hickford
    In my previous post, “Automating Deployments with SQL Compare command line” I looked at how teams can automate the deployment and post deployment validation of SQL Server databases using the command line versions of Red Gate tools. In this post I’m looking at another use for the command line tools, namely using them to generate up-to-date documentation with every database change. There are many reasons why up-to-date documentation is valuable, for example when somebody new has to work on or administer a database for the first time, or when a new database comes into service. Having database documentation reduces the risks of making incorrect decisions when making changes. Documentation is very useful to business intelligence analysts when writing reports, for example in SSRS. There are a couple of great examples talking about why up to date documentation is valuable on this site: Database Documentation – Lands of Trolls: Why and How? and Database Documentation Using SQL Doc. The short answer is that it can save you time and reduce risk when you need that most! SQL Doc is a fast, simple tool that automatically generates database documentation. It can create documents in HTML, Word or pdf files. The documentation contains information about object definitions and dependencies, along with any other information you want to associate with each object. The SQL Doc GUI, which is included in Red Gate’s SQL Developer Bundle and SQL Toolbelt, allows you to add additional notes to objects, and customise which objects are shown in the docs. These settings can be saved as a .sqldoc project file. The SQL Doc command line can use this project file to automatically update the documentation every time the database is changed, ensuring that the documentation is always up to date. The simplest way to keep documentation up to date is probably to use a scheduled task to run a script every day. However if you have a source controlled database, or are using a Continuous Integration (CI) server or a build server, it may make more sense to use that instead. If you’re using SQL Source Control or SSDT Database Projects to help version control your database, you can automatically update the documentation after each change is made to the source control repository that contains your database. To get this automation in place, you can use the functionality of a Continuous Integration (CI) server, which can trigger commands to run when a source control repository has changed. A CI server will also capture and save the documentation that is created as an artifact, so you can always find the exact documentation for a specific version of the database. This forms an always up to date data dictionary. If you don’t already have a CI server in place there are several you can use, such as the free open source Jenkins or the free starter editions of TeamCity. I won’t cover setting these up in this article, but there is information about using CI servers for automating database tasks on the Red Gate Database Delivery webpage. You may be interested in Red Gate’s SQL CI utility (part of the SQL Automation Pack) which is an easy way to update a database with the latest changes from source control. The PowerShell example below shows how to create the documentation from a database. That database might be your integration database or a shared development database that is always up to date with the latest changes.

        $serverName = "server\instance"
        $databaseName = "databaseName" # If you want to document multiple databases use a comma separated list
        $userName = "username"
        $password = "password"
        # Path to SQLDoc.exe
        $SQLDocPath = "C:\Program Files (x86)\Red Gate\SQL Doc 3\SQLDoc.exe"
        $arguments = @(
            "/server:$($serverName)",
            "/database:$($databaseName)",
            "/username:$($userName)",
            "/password:$($password)",
            "/filetype:html",
            "/outputfolder:.",
            # "/project:$args[0]", # If you already have a .sqldoc project file you can pass it as an argument to this script. Values in the project will be overridden with any options set on the command line
            "/name:$databaseName Report",
            "/copyrightauthor:$([Environment]::UserName)"
        )
        write-host $arguments
        & $SQLDocPath $arguments

    There are several options you can set on the command line to vary how your documentation is created. For example, you can document multiple databases or exclude certain types of objects. In the example above, we set the name of the report to match the database name, and use the current Windows user as the documentation author. For more examples of how you can customise the report from the command line please see the SQL Doc command line documentation. If you already have a .sqldoc project file, or wish to further customise the report by including or excluding specific objects, you can use this project on the command line. Any settings you specify on the command line will override the defaults in the project. For details of what you can customise in the project please see the SQL Doc project documentation. In the example above, the line to use a project is commented out, but you can uncomment this line and then pass a path to a .sqldoc project file as an argument to this script. Conclusion: keeping documentation about your databases up to date is very easy to set up using SQL Doc and PowerShell. By using a CI server to run this process you can trigger the documentation to be run on every change to a source controlled database, and keep historic documentation available. If you are considering more advanced database automation, e.g. database unit testing, change script generation, deploying to large numbers of targets and backup/verification, please email me at [email protected] for further script samples or if you have any questions.

    Read the article

  • Best practice to query data from MS SQL Server in C Sharp?

    - by Bruno
    What is the best way to query data from a MS SQL Server in C Sharp? I know that it is not good practice to have an SQL query in the code. Is the best way to create a stored procedure and call it from C Sharp with parameters?

        using (var conn = new SqlConnection(connStr))
        using (var command = new SqlCommand("StoredProc", conn) { CommandType = CommandType.StoredProcedure })
        {
            conn.Open();
            command.ExecuteNonQuery();
            conn.Close();
        }
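    For completeness, the T-SQL half of that pattern is just an ordinary parameterized procedure; on the C# side you would add matching entries to command.Parameters before executing. A sketch, where the procedure, table and parameter names are made up for illustration:

        CREATE PROCEDURE dbo.StoredProc
            @CustomerId int,
            @FromDate   date
        AS
        BEGIN
            SET NOCOUNT ON;
            SELECT  OrderId, OrderDate, Total
            FROM    dbo.Orders
            WHERE   CustomerId = @CustomerId
              AND   OrderDate >= @FromDate;
        END;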

    Read the article

  • Idea to develop a caching server between IIS and SQL Server

    - by John
    I work on a few high traffic websites that all share the same database and that are all heavily database driven. Our SQL server is maxed out; we have already implemented many changes that have helped, but the server is still working too hard. We employ some caching in our website but the type of queries we use negates using SQL dependency caching. We tried SQL replication to try and kind of load balance, but that didn't prove very successful because the replication process is quite demanding on the servers too and it needed to be done frequently, as it is important that data is up to date. We do use a Varnish web caching server (Linux based) to take a bit of the load off both the web and database server, but as a lot of the sites are customised based on the user we can only do so much. Anyway, the reason for this question... Varnish gave me an idea for a possible application that might help in this situation. Just like Varnish sits between a web browser and the web server and caches responses from the web server, I was wondering about the possibility of creating something that sits between the web server and the database server. Imagine that all SQL queries go through this SQL caching server. If it's a first-time query, it will get recorded, and the result requested from the SQL server and stored locally on the cache server. If it's a repeat request within a set time, the result gets retrieved from the local copy without the query being sent to the SQL server. The caching server could also take advantage of SQL dependency caching notifications. This seems like a good idea in theory. There's still the same amount of data moving back and forward from the web server, but the SQL server is relieved of the work of processing the repeat queries. I wonder how difficult it would be to build a service that sort of emulates requests and responses from SQL server, whether SQL server's own caching is doing enough of this already that this wouldn't be a benefit, or even if someone has done this before and I haven't found it? I would welcome any feedback or any references to any relevant projects.

    Read the article

  • cocos2d-x and handling touch events

    - by Jason
    I have my sprites on screen and I have a vector that stores each sprite. Can a CCSprite* handle a touch event? Or just the CCLayer*? What is the best way to decide which sprite was touched? Should I store the coordinates of where the sprite is (in the sprite class) and, when I get the event, see if where the user touched is where the sprite is by looking through the vector and getting each sprite's current coordinates? UPDATE: I subclass CCSprite:

        class Field : public cocos2d::CCSprite, public cocos2d::CCTargetedTouchDelegate

    and I implement these functions:

        cocos2d::CCRect rect();
        virtual void onEnter();
        virtual void onExit();
        bool containsTouchLocation(cocos2d::CCTouch* touch);
        virtual bool ccTouchBegan(cocos2d::CCTouch* touch, cocos2d::CCEvent* event);
        virtual void ccTouchMoved(cocos2d::CCTouch* touch, cocos2d::CCEvent* event);
        virtual void ccTouchEnded(cocos2d::CCTouch* touch, cocos2d::CCEvent* event);
        virtual void touchDelegateRetain();
        virtual void touchDelegateRelease();

    I put CCLOG statements in each one and I don't hit them! When I touch the CCLayer this sprite is on, though, I do hit the handlers in the class that implements the layer and puts these sprites on it.

    Read the article

  • Upcoming UPK Events

    - by kathryn.lustenberger(at)oracle.com
    February 15th: UPK: Follow Panduit's Lead and Leverage Oracle's User Productivity Kit To Achieve Your Goals - Join us for a live webcast to learn how Oracle's User Productivity Kit can help you meet and exceed your goals. The webcast will feature Jim Boss, from the Panduit Corporation, who will share how Oracle's User Productivity Kit was used with both Oracle and non-Oracle applications to help Panduit meet its goals. Date: February 15th, 2011 at 12:00 PST / 3:00 EST Evite: http://www.oracle.com/us/dm/65630-naod10046029mpp005c010-se-300908.html March 2nd: Synaptis teams with Oracle to deliver a UPK customer success story - Webinar Offering The Value of UPK (Customer Success Story): How to leverage the value of UPK to streamline processes and maximize end user adoption for a global implementation. Join us to learn how the power of UPK can be leveraged to train end users globally in a successful and cost effective manner. A valued Oracle UPK customer will share experiences, successes, challenges, and strategies. The webinar will also include a question and answer session to give the attendees an opportunity to interact directly with the Oracle UPK customer, Synaptis, and the Oracle UPK Team. Date: March 2, 2011 Time: 11:00am - 12:00pm EST Register for this webinar March 27 - 30th: The Alliance 2011 conference is an annual event for all higher education, government, and public sector users of Oracle applications. The Alliance conference is organized and managed by the Higher Education User Group (www.heug.org). This is the 14th annual event for the HEUG. This is your opportunity to join with over 3200 other Higher Education, Federal, State and Local Government users to network, learn and share in our amazing combined experiences. The Alliance conference team is hard at work, putting together the best conference ever for 2011 - so don't delay, make your plans now to be part of Alliance 2011! When: Sunday, March 27th, 2011 - Wednesday, March 30, 2011 Where: The Colorado Convention Center (Denver, Colorado) Registration for Alliance 2011 is Now Open! UPK will be represented at this event offering: Pre-Conference Training Learn the Basics of Oracle User Productivity Kit (UPK) Taking Your UPKs to a Whole New Level, Advanced Use of UPK Demo Pod Staff Sessions: Oracle User Productivity Kit: Creating Value throughout the Project Lifecycle Beyond Basic UPK -- User Tracking and SmartHelp Leveraging Oracle and User Productivity Kit (UPK) to Develop a Comprehensive Training Program Oracle User Productivity Kit Strategy and Roadmap -- Key to User Adoption April 10 - 14th: Registration for COLLABORATE 11 has begun - Don't miss the most comprehensive, user-driven conference devoted to Oracle applications and technology. Collaborate with a global network of more than 5,000 peers and experts to share real-world experiences, solve your challenges and gain insights to validate your technology plans. Read below to discover which group to register with for the best value. UPK will be represented at this event offering: Demo Pod Staff Sessions: Oracle User Productivity Kit: Creating Value throughout the Project Lifecycle Centralize all Project Team assets, AND, Deploy Fully Measurable Training with UPK Pro Oracle User Productivity Kit Strategy and Roadmap - Key to User Adoption Registration is Now Open!

    Read the article

  • SQL Server for the Oracle DBA Links

    - by BuckWoody
    I do a presentation (and a class) called "SQL Server for the Oracle DBA". It's a non-marketing overview that gives you the basics of working with SQL Server if you're already familiar with how Oracle works. This class and these links DO NOT help you with "Why should I use Oracle/SQL Server instead of Oracle/SQL Server" - I'll assume you're already there, and if not, there are LOTS of sites to help you make that decision. Although these links might contain slight marketing slants (I don't control them) I've tried to get the best links I can. Feel free to comment here to add more/better links. As such, these aren't links that help you work with Oracle - they are links to help you work with SQL Server. Some of them contain more information than you actually need, others don't have near enough. Taken together (and with the class) you're able to get done what you need to do.
        "Practical SQL Server for Oracle Professionals" - A Microsoft Whitepaper, probably the best place to get started: http://download.microsoft.com/download/6/9/d/69d1fea7-5b42-437a-b3ba-a4ad13e34ef6/SQLServer2008forOracle.docx
        Free Training: http://technet.microsoft.com/en-us/sqlserver/dd548020.aspx
        Classroom training (will cost you): http://www.microsoft.com/learning/en/us/course.aspx?ID=50068A&locale=en-us
        Terminology Differences: http://www.associatedcontent.com/article/2383466/oracle_and_sql_server_basic_terminology.html
        Datatype mapping between Oracle and SQL Server: http://msdn.microsoft.com/en-us/library/ms151817.aspx
        The "other" direction - can still be useful for the Oracle professional to see the other side: http://blog.benday.com/archive/2008/10/23/23195.aspx

    Read the article

  • Database Security Events in April

    - by Troy Kitch
    Wed, Apr 18, Executive Oracle Database Security Round Table - Tampa, FL Tue, Apr 24, ISC(2) Leadership Regional Event Series - San Diego, CA April 24 - May 17,  Independent Oracle Users Group Enterprise Data at Risk Seminar Series Tue, Apr 24 IOUG Enterprise Data at Risk Seminar Series - Toronto Wed, Apr 25 IOUG Enterprise Data at Risk Seminar Series - New York Thu, Apr 26 IOUG Enterprise Data at Risk Seminar Series - Boston Thu, Apr 26 ISC(2) Leadership Regional Event Series - San Jose, CA

    Read the article

  • Coding Dynamic Events?

    - by Joey Green
    I have no idea what the title of this question should be, so bear with me. My game has turns. On a turn a player does something, and this can result in a random number of explosions that occur at different times. I know when each explosion is done; I need to know when ALL are done, and then do some other action. Also, each explosion takes the same amount of time, say 3 seconds. Right now I'm thinking of using a counter to hold how many explosions are happening. Then, once an explosion is finished, decrement this counter. Once the counter is zero, do my action. This idea is inspired by Objective-C memory management, btw. Anyway, does this sound like a good approach, or would there be another way? An alternative might be to figure out the explosion that happens last and let it be responsible for calling this subsequent action. I'm asking mostly because I haven't done this before and am trying to figure out if there are bugs that may occur that I'm not foreseeing.

    Read the article

  • Is inline SQL still classed as bad practice now that we have Micro ORMs?

    - by Grofit
    This is a bit of an open-ended question, but I wanted some opinions. I grew up in a world where inline SQL scripts were the norm; then we were all made very aware of SQL injection issues, and of how fragile the SQL was when doing string manipulations all over the place. Then came the dawn of the ORM, where you were explaining the query to the ORM and letting it generate its own SQL, which in a lot of cases was not optimal but was safe and easy. Another good thing about ORMs or database abstraction layers was that the SQL was generated with its database engine in mind, so I could use Hibernate/NHibernate with MSSQL or MySQL and my code never changed; it was just a configuration detail. Now fast-forward to the current day, where micro ORMs seem to be winning over more developers, and I wonder why we have seemingly taken a U-turn on the whole inline SQL subject. I must admit I do like the idea of no ORM config files and being able to write my query in a more optimal manner, but it feels like I am opening myself back up to the old vulnerabilities such as SQL injection, and I am also tying myself to one database engine, so if I want my software to support multiple database engines I would need to do some more string hackery, which seems to start making code unreadable and more fragile. (Before someone mentions it: I know you can use parameter-based arguments with most micro ORMs, which offers protection from SQL injection in most cases.) So what are people's opinions on this sort of thing? I am using Dapper as my micro ORM in this instance and NHibernate as my regular ORM in this scenario; however, most in each field are quite similar. What I term inline SQL is SQL strings within source code. There used to be design debates over SQL strings in source code detracting from the fundamental intent of the logic, which is why statically typed LINQ-style queries became so popular: it's still just one language. But with, say, C# and SQL on one page you have two languages intermingled in your raw source code. Just to clarify: SQL injection is only one of the known issues with using SQL strings, and I already mentioned you can stop it with parameter-based queries. However, I highlight other issues with having SQL queries ingrained in your source code, such as the lack of DB vendor abstraction, as well as losing any level of compile-time error capturing on string-based queries. These are all issues we managed to sidestep with the dawn of ORMs and their higher-level querying functionality, such as HQL or LINQ etc. (not all of the issues, but most of them). So I am less focused on the individual highlighted issues and more on the bigger picture: is it now becoming more acceptable to have SQL strings directly in your source code again, as most micro ORMs use this mechanism? Here is a similar question which has a few different viewpoints, although it is more about inline SQL without the micro ORM context: http://stackoverflow.com/questions/5303746/is-inline-sql-hard-coding

    Read the article

  • WordPress is now nicely supported on SQL Server (and SQL Azure for that matter)

    - by Eric Nelson
    WordPress is enormously popular for blogs and full websites thanks to an awesome eco system which has built up around it, the simplicity (relatively) of getting it up and running, plus the flexibility to “bend it” in all sorts of directions. When I say bend, check out the following, which are all WordPress sites:
        My “back up blog” http://iupdateable.wordpress.com/
        My group’s “odd site” :) http://ubelly.com
        My favourite “cheap games” site http://www.frugalgaming.co.uk/
    WordPress users typically run their sites on Linux and MySQL, although PHP (the language in which WordPress is written) can be happily run on Windows. Both fine technologies in their own right, but for me (and probably a fair few others) I would love to use WordPress but with the technologies I know best (aka Windows, IIS and SQL Server). However, that has proven to be actually rather tricky in practice to get working – until now. Earlier last month OmniTI released a patch for WordPress which provides SQL Server and SQL Azure support. In parallel with that, some fine folks inside Microsoft have also created http://wordpress.visitmix.com which contains information about running WordPress on the Microsoft platform with a particular focus on SQL Server and SQL Azure. Top stuff! To run WordPress with SQL Server: Download and Install the WordPress on SQL Server Distro/Patch. And then you will quite likely need to migrate: check out how to Migrate to Windows and SQL Server by Zach Owens, who is moving his blog to Windows and SQL Server. Enjoy!
    Related Links:
        Running PHP on IIS on Windows http://php.iis.net/
        If PHP is not your thing, then the following blog engines are .NET based:
        BlogEngine http://www.dotnetblogengine.net/
        DasBlog http://www.dasblog.info/
        Subtext http://subtextproject.com/ (which happens to power http://geekswithblogs.net where my main blog is http://geekswithblogs.net/iupdateable)

    Read the article

  • SQL Server: Must numbers all be specified with Latin numeral digits?

    - by Ian Boyd
    Does SQL Server expect numbers to be specified with digits from the Latin alphabet, e.g. 0123456789? Is it valid to give SQL Server digits in other alphabets? Rosetta Stone:
        Latin:   01234567890
        Arabic:  ٠١٢٣٤٥٦٧٨٩
        Bengali: ০১২৩৪৫৬৭৮৯
    I know that the client (ADO) will convert 8-bit strings to 16-bit Unicode strings using the current culture. But the client is also converting numbers to strings using their current culture, e.g.:
        SELECT * FROM Inventory WHERE Quantity > ???,??
    which throws SQL Server for fits. I know that the server/database has its defined code page and locale, but that is for strings. Will SQL Server interpret numbers using the active (or per-login specified) locale, or must all numeric values be specified with Latin numeral digits?
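    A quick way to see the behaviour for yourself (a sketch; the exact error text may differ by version): numeric literals and string-to-number conversions in T-SQL only understand the ASCII digits 0-9, so digits from other scripts have to be converted on the client side before they reach the server.

        SELECT CAST(N'123' AS int);   -- works, returns 123
        SELECT CAST(N'١٢٣' AS int);   -- fails with a conversion error: Arabic-Indic digits are not parsed as a number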

    Read the article

  • How do you think while formulating SQL queries? Is it experience or a concept?

    - by Shantanu Gupta
    I have been working on SQL Server and front-end coding and have usually faced problems formulating queries. I understand most of the SQL concepts needed to formulate queries, but whenever some new functionality comes into the picture that can be done with a SQL query, I usually fail to resolve it. I am very comfortable with SELECT queries using joins and such, but when it comes to DML operations I usually struggle. Every query I have never written before feels uncomfortable while I am creating it, and whenever I go for an interview I face this problem. Is there some concept behind approaching and formulating SQL queries? E.g., I need to create a SQL query such that: a table contains a single column holding duplicate records, and I need to remove the duplicates. I know I can find the solution to this query very easily by Googling, but I want to know how everyone else arrives at the desired result. Is it something like "practice makes perfect", i.e. once you have done it, next time you will be able to formulate it, or is there some logic or concept behind it? I could have gotten my answer to the above problem simply by posting it on Stack Overflow, and I would have had an answer within 5 to 10 minutes, but I want to know the reasoning. How do you work on any new kind of query? Is it mostly a contribution of experience, or an implementation of concepts? Whenever I learn something new in coding I try to utilize it wherever I can, but here the scenario seems different, because maybe I am lagging on some concepts.
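    Taking the duplicate-removal example as a concrete case, one common pattern (a sketch; dbo.SingleColumn and val are placeholder names) is to number the copies with ROW_NUMBER() and delete everything after the first:

        ;WITH numbered AS
        (
            SELECT  val,
                    ROW_NUMBER() OVER (PARTITION BY val ORDER BY val) AS rn
            FROM    dbo.SingleColumn
        )
        DELETE FROM numbered
        WHERE   rn > 1;

    Patterns like this tend to come from a mix of both: the concept (window functions, deleting through a CTE) has to be learned once, and experience is what makes it come to mind under interview pressure.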

    Read the article
