Search Results

Search found 42327 results on 1694 pages for 'microsoft sql t sql'.


  • PASS Summit – looking back on my first time

    - by Fatherjack
      So I was lucky enough to get my first experience of PASS Summit this year, and took some time beforehand to read some blogs and reference material to get an idea of what to do and how to get the best out of my visit. Having been to other conferences – technical and non-technical – I had a reasonable idea of the routine and what to expect in general. Here is a list of a few things that I have learned/remembered as the week has gone by.

      Wear comfortable shoes. This actually needs to be broadened to "take several pairs of comfortable shoes". You will be on your feet for many, many hours, for several days one after another. Having comfortable feet that can literally support you for the duration will make the week in general a whole lot better. You could well be walking not only at the conference but also getting to and from it, and in the evenings you will be walking around town and standing talking in various bars and clubs. Looking back, on some days I was on my feet for over 20 hours.

      Make friends. This is a given for the long-term benefits it brings, but there is also an immediate reward in being at a conference with a friend or two. Some events are bigger and more popular than others, and some have the type of session that every single attendee will want to be in. This is great for those that get in, but if you are in the bathroom or queuing for coffee and you miss out, it sucks. Having a friend who can get into a room and reserve you a seat is a great advantage in making sure you get the content that you want to see and still have the coffee that you need.

      Don’t go to every session you want to see. This might sound counterintuitive, and it relies on the sessions being recorded in some way to guarantee you don’t totally miss out. Both PASS Summit and SQLBits sessions are recorded (Summit is audio, SQLBits is video), and this means that if you get into a good conversation with someone over a coffee you don’t have to break it up to go to a session. Obviously there is a trade-off here and you need to decide on the tipping point for yourself, but a conversation at a place like this could make a big difference to the next contract or employer you have, or it might simply be great catching up with some friends you don’t see so often.

      Go to at least one session you don’t want to. Again, this will seem contrary to normal logic, but there is no reason why you shouldn’t learn about a part of SQL Server that isn’t part of your daily routine. Not only will you learn something new, but you will also pick up on the feelings and attitudes of the people in the session. So, if you are a DBA, head off to a BI session and so on. You’ll hear BI speakers speaking to a BI audience and get to understand their point of view and their reasoning for making the decisions they do. You will also appreciate the way that your decisions and instructions affect the way they have to work. This will help you a lot when you are on a project working with multiple teams, and make you all more productive.

      Socialise. While you are at the conference venue, speak to people. Ask questions; be interested in whoever you are speaking to. You get chances to talk to new friends at breakfast, dinner and every break between sessions. The only people that might not talk to you would be speakers that are about to go and give a session; in most cases speakers like peace and quiet before going on stage. Other than that, the people around you are just waiting for someone to talk to them, so make the first move. There is a whole lot going on outside of the conference hours, and you should make an effort to join in with some of this too, whether at karaoke evenings or just out for a quiet drink with a few of the people you meet at the conference. Either way, don’t be a recluse and hide in your room or be alone out in the town.

      Don’t talk to people. Once again this sounds wrong, but stay with me. I have spoken to a number of speakers since Summit 2013 finished, and they have all mentioned the time it has taken them to move about the conference venue due to people stopping them for a chat or to ask a question: 45 minutes to walk from a session room to the speaker room in one case. Wow. While none of the speakers were upset about this sort of delay, I think delegates should take the situation into account and possibly defer their question to an email, or to a time when the person they want is clearly less in demand. Give them a chance to enjoy the conference in the same way that you are; they may actually want to go to a session or just have a rest after giving their session – talking for 75 minutes is hard work; an extra 45 minutes right after is unbelievable. I certainly hope that they get good feedback on their sessions, and perhaps if you spoke to a speaker outside a session you can give them a mention in the ‘any other comments’ part of the feedback, just to convey your gratitude for them giving up their time and expertise for free.

      Say thank you. I just mentioned giving the speakers a clear, visible ‘thank you’ in the feedback, but there are plenty of people that help make any conference the success it is who would really appreciate hearing that their efforts are valued. People on the registration desk, volunteers giving schedule guidance and directions, people in the community zone are all volunteers giving their time to help you have the best experience possible. Send an email to PASS and convey your thoughts about the work that was done. Maybe you want to be a volunteer next time, so you could enquire how to get into that position at the same time.

      This isn’t an exhaustive list, and you may agree or disagree with the points I have made; please add anything you think is good advice in the comments. I’d like to finish by saying a huge thank you to all the people involved in planning, facilitating and executing the PASS Summit 2013. It was an excellent event, and I know many others think it was a totally worthwhile event to attend.

    Read the article

  • Entity Framework 6: Alpha2 Now Available

    - by ScottGu
    The Entity Framework team recently announced the 2nd alpha release of EF6. The alpha 2 package is available for download from NuGet. Since this is a pre-release package, make sure to select “Include Prereleases” in the NuGet package manager, or execute the following from the package manager console to install it:

        PM> Install-Package EntityFramework -Pre

    This week’s alpha release includes a bunch of great improvements in the following areas:

    - Async language support is now available for queries and updates when running on .NET 4.5.
    - Custom conventions now provide the ability to override the default conventions that Code First uses for mapping types, properties, etc. to your database.
    - Multi-tenant migrations allow the same database to be used by multiple contexts, with full Code First Migrations support for independently evolving the model backing each context.
    - Using Enumerable.Contains in a LINQ query is now handled much more efficiently by EF and the SQL Server provider, resulting in greatly improved performance.
    - All features of EF6 (except async) are available on both .NET 4 and .NET 4.5. This includes support for enums and spatial types and the performance improvements that were previously only available when using .NET 4.5.
    - Start-up time for many large models has been dramatically improved thanks to improved view generation performance.

    Below are some additional details about a few of the improvements above:

    Async Support

    .NET 4.5 introduced the Task-Based Asynchronous Pattern that uses the async and await keywords to help make writing asynchronous code easier. EF 6 now supports this pattern. This is great for ASP.NET applications, as database calls made through EF can now be processed asynchronously – avoiding any blocking of worker threads. This can increase scalability on the server by allowing more requests to be processed while waiting for the database to respond. The following code shows an MVC controller that is querying a database for a list of location entities:

        public class HomeController : Controller
        {
            LocationContext db = new LocationContext();

            public async Task<ActionResult> Index()
            {
                var locations = await db.Locations.ToListAsync();
                return View(locations);
            }
        }

    Notice above the call to the new ToListAsync method with the await keyword. When the web server reaches this code it initiates the database request, but rather than blocking while waiting for the results to come back, the thread that is processing the request returns to the thread pool, allowing ASP.NET to process another incoming request with the same thread. In other words, a thread is only consumed when there is actual processing work to do, allowing the web server to handle more concurrent requests with the same resources. A more detailed walkthrough covering async in EF is available with additional information and examples. Also a walkthrough is available showing how to use async in an ASP.NET MVC application.

    Custom Conventions

    When working with EF Code First, the default behavior is to map .NET classes to tables using a set of conventions baked into EF. For example, Code First will detect properties that end with “ID” and configure them automatically as primary keys. However, sometimes you cannot or do not want to follow those conventions and would rather provide your own. For example, maybe your primary key properties all end in “Key” instead of “Id”. Custom conventions allow the default conventions to be overridden or new conventions to be added, so that Code First can map by convention using whatever rules make sense for your project. The following code demonstrates using custom conventions to set the precision of all decimals to 5. As with other Code First configuration, this code is placed in the OnModelCreating method, which is overridden on your derived DbContext class:

        protected override void OnModelCreating(DbModelBuilder modelBuilder)
        {
            modelBuilder.Properties<decimal>()
                .Configure(x => x.HasPrecision(5));
        }

    But what if there are a couple of places where a decimal property should have a different precision? Just as with all the existing Code First conventions, this new convention can be overridden for a particular property simply by explicitly configuring that property using either the fluent API or a data annotation. A more detailed description of custom Code First conventions is available here.

    Community Involvement

    I blogged a while ago about EF being released under an open source license. Since then a number of community members have made contributions, and these are included in EF6 alpha 2. Two examples of community contributions are:

    - AlirezaHaghshenas contributed a change that increases the startup performance of EF for larger models by improving the performance of view generation. The change means that it is less often necessary to use pre-generated views.
    - UnaiZorrilla contributed the first community feature to EF: the ability to load all Code First configuration classes in an assembly with a single method call like the following:

        protected override void OnModelCreating(DbModelBuilder modelBuilder)
        {
            modelBuilder.Configurations
                .AddFromAssembly(typeof(LocationContext).Assembly);
        }

    This code will find and load all the classes that inherit from EntityTypeConfiguration<T> or ComplexTypeConfiguration<T> in the assembly where LocationContext is defined. This reduces the amount of coupling between the context and the Code First configuration classes, and is also a very convenient shortcut for large models.

    Other upcoming features coming in EF 6

    Lots of information about the development of EF6 can be found on the EF CodePlex site, including a roadmap showing the other features that are planned for EF6. One of the nice upcoming features is connection resiliency, which will automate the process of retrying database operations on the transient failures common in cloud environments and with databases such as Windows Azure SQL Database. Another often-requested feature that will be included in EF6 is the ability to map stored procedures to query and update operations on entities when using Code First.

    Summary

    EF6 is the first open source release of Entity Framework being developed on CodePlex. The alpha 2 preview release of EF6 is now available on NuGet, and contains some really great features for you to try. The EF team are always looking for feedback from developers - especially on the new features such as custom Code First conventions and async support. To provide feedback you can post a comment on the EF6 alpha 2 announcement post, start a discussion or file a bug on the CodePlex site. Hope this helps, Scott

    P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu
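    To make that per-property override concrete, here is a minimal sketch using the alpha conventions API quoted above (the Order entity and its Total property are hypothetical names for illustration, not from the announcement):

        using System.Data.Entity;

        // Hypothetical entity used only to illustrate the override.
        public class Order
        {
            public int OrderId { get; set; }
            public decimal Total { get; set; }
        }

        public class ShopContext : DbContext
        {
            public DbSet<Order> Orders { get; set; }

            protected override void OnModelCreating(DbModelBuilder modelBuilder)
            {
                // Convention: every decimal property gets precision 5.
                modelBuilder.Properties<decimal>()
                    .Configure(x => x.HasPrecision(5));

                // Override for one property via the fluent API: precision 18, scale 2.
                modelBuilder.Entity<Order>()
                    .Property(o => o.Total)
                    .HasPrecision(18, 2);
            }
        }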

    Read the article

  • SubSonic 3 LINQ vs LINQ to SQL

    - by Jamil
    Hi, I am using SQL Server 2005 in a project, and I have to decide on a data layer. I would like to use LINQ in my project. I saw that SubSonic 3 supports LINQ, and I also have the option of LINQ to SQL, because I can get typed lists from LINQ to SQL. I am wondering what the difference is between LINQ to SQL and SubSonic 3's LINQ support, and which is more beneficial. Thanks! JAMIL

    Read the article

  • Drupal SQL injection prevention and apostrophe handling in forms

    - by jini
    In typical PHP applications I used to run mysql_real_escape_string over input before doing SQL inserts. However, I am unable to do that in Drupal, so I need some assistance; without any function like that, user input containing apostrophes is breaking my code. Please advise. Thank you. My SQL is as follows: $sql = "INSERT INTO some_table (field1, field2) VALUES ('$field1', '$field2')"; db_query($sql);

    Read the article

  • Am I missing something about LINQ?

    - by Jason Baker
    I seem to be missing something about LINQ. To me, it looks like it's taking some of the elements of SQL that I like the least and moving them into the C# language and using them for other things. I mean, I could see the benefit of using SQL-like statements on things other than databases. But if I wanted to write SQL, well, why not just write SQL and keep it out of C#? What am I missing here?

    Read the article

  • NHibernate + Sql Compact + IoC - Connection Management

    - by Michael
    When working with NHibernate and SQL Compact in a Windows Forms application, I am wondering what the best practice is for managing connections. With SQL CE, I have read that you should keep your connection open, rather than closing it as one would typically do with standard SQL Server. If that is the case, and you are using an IoC container, would you make your repositories' lifetime singleton so they exist forever, or dispose of them after you perform a unit of work? Also, is there a way to determine the number of connections open to SQL CE?
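    One commonly suggested arrangement (a sketch under generic IoC assumptions, not necessarily this thread's answer): make the ISessionFactory the singleton, since it is expensive to build and thread-safe, and open a short-lived ISession per unit of work, so that no repository has to own a connection for its whole lifetime:

        using NHibernate;
        using NHibernate.Cfg;

        // Sketch: one session factory for the app's lifetime (register it as a
        // singleton in your IoC container), one ISession per unit of work.
        public static class SessionFactoryHolder
        {
            private static readonly ISessionFactory Factory =
                new Configuration().Configure().BuildSessionFactory();

            public static ISession OpenSession()
            {
                return Factory.OpenSession();
            }
        }

        public class CustomerRepository
        {
            // "customer" stands in for any mapped entity; the type is illustrative.
            public void Save(object customer)
            {
                using (ISession session = SessionFactoryHolder.OpenSession())
                using (ITransaction tx = session.BeginTransaction())
                {
                    session.SaveOrUpdate(customer);
                    tx.Commit();
                } // session, and with it the underlying connection, is released here
            }
        }

    NHibernate's connection.release_mode configuration setting then governs when the underlying SQL CE connection is actually closed, so the "keep the connection open" advice can be handled in configuration rather than by giving repositories singleton lifetimes.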

    Read the article

  • PL/SQL exceptions and Java programs

    - by edwards
    Hi. Business logic is coded in PL/SQL packages, procedures and functions. Java programs call the PL/SQL packages, procedures and functions to do database work. The issue now is that the PL/SQL programs store exceptions in Oracle tables whenever an exception is raised. How would my Java programs get the exceptions, given that instead of being propagated from PL/SQL to Java, the exception is persisted to an Oracle table?

    Read the article

  • Telling How Much Space SQLCE Tables are Using

    - by peter
    Is there a way to get a record count or a space-used value for all the tables in a SQL Server Compact database file (SQL CE 3.5.1)? I saw this question: http://stackoverflow.com/questions/2075026/sql-server-compact-count-records-get-space-used-for-all-tables-in-database. So perhaps another option would be: is there an easy way to move a SQL CE database to SQL Express? Would that give a realistic measurement of the database table sizes?
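    For the record-count half of the question, a minimal sketch that walks INFORMATION_SCHEMA.TABLES and counts rows (the .sdf path is a placeholder, and row counts are only a rough proxy for space used):

        using System;
        using System.Collections.Generic;
        using System.Data.SqlServerCe;

        // Sketch: per-table row counts for a SQL CE 3.5 file.
        // Reference System.Data.SqlServerCe.dll; the file path is illustrative.
        class CeTableCounts
        {
            static void Main()
            {
                using (var conn = new SqlCeConnection(@"Data Source=C:\data\mydb.sdf"))
                {
                    conn.Open();
                    var tables = new List<string>();
                    using (var cmd = new SqlCeCommand(
                        "SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES", conn))
                    using (var reader = cmd.ExecuteReader())
                    {
                        while (reader.Read()) tables.Add(reader.GetString(0));
                    }
                    foreach (string table in tables)
                    {
                        using (var count = new SqlCeCommand(
                            "SELECT COUNT(*) FROM [" + table + "]", conn))
                        {
                            Console.WriteLine("{0}: {1} rows", table, count.ExecuteScalar());
                        }
                    }
                }
            }
        }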

    Read the article

  • Which is the "best" data access framework/approach for C# and .NET?

    - by Frans
    (EDIT: I made it a community wiki as it is more suited to a collaborative format.)

    There are a plethora of ways to access SQL Server and other databases from .NET. All have their pros and cons, and it will never be a simple question of which is "best"; the answer will always be "it depends". However, I am looking for a comparison at a high level of the different approaches and frameworks in the context of different levels of systems. For example, I would imagine that for a quick-and-dirty Web 2.0 application the answer would be very different from that for an in-house enterprise-level CRUD application. I am aware that there are numerous questions on Stack Overflow dealing with subsets of this question, but I think it would be useful to try to build a summary comparison. I will endeavour to update the question with corrections and clarifications as we go. So far, this is my understanding at a high level, but I am sure it is wrong. I am primarily focusing on the Microsoft approaches to keep this focused.

    ADO.NET Entity Framework
    - Database agnostic
    - Good because it allows swapping backends in and out
    - Bad because it can hit performance, and database vendors are not too happy about it
    - Seems to be MS's preferred route for the future
    - Complicated to learn (though, see 267357)
    - It is accessed through LINQ to Entities, so it provides an ORM, thus allowing abstraction in your code

    LINQ to SQL
    - Uncertain future (see "Is LINQ to SQL truly dead?")
    - Easy to learn (?)
    - Only works with MS SQL Server
    - See also "Pros and cons of LINQ"

    "Standard" ADO.NET
    - No ORM
    - No abstraction, so you are back to "roll your own" and playing with dynamically generated SQL (a minimal sketch of this appears at the end of this entry)
    - Direct access, allows potentially better performance

    This ties in to the age-old debate of whether to focus on objects or relational data, to which the answer of course is "it depends on where the bulk of the work is"; and since that is an unanswerable question, hopefully we don't have to go into that too much. IMHO, if your application is primarily manipulating large amounts of data, it does not make sense to abstract it too much into objects in the front-end code; you are better off using stored procedures and dynamic SQL to do as much of the work as possible on the back-end. Whereas if you primarily have user interaction which causes database interaction at the level of tens or hundreds of rows, then ORM makes complete sense. So I guess my argument for good old-fashioned ADO.NET would be in the case where you manipulate and modify large datasets, in which case you will benefit from the direct access to the backend. Another case, of course, is where you have to access a legacy database that is already guarded by stored procedures.

    ASP.NET Data Source Controls
    - Are these something altogether different, or just a layer over standard ADO.NET?
    - Would you really use these if you had a DAL, or if you implemented LINQ or Entities?

    NHibernate
    - Seems to be a very powerful ORM?
    - Open source

    Some other relevant links: "NHibernate or LINQ to SQL"; "Entity Framework vs LINQ to SQL".
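    For a feel of the "roll your own" ADO.NET option above, here is a minimal parameterized-query sketch (the connection string and the Orders table are illustrative placeholders, not from the question):

        using System;
        using System.Data.SqlClient;

        // Plain ADO.NET: no ORM, no abstraction; you write and manage the SQL yourself.
        class PlainAdoNetExample
        {
            static void Main()
            {
                using (var conn = new SqlConnection(
                    "Data Source=.;Initial Catalog=Shop;Integrated Security=True"))
                using (var cmd = new SqlCommand(
                    "SELECT COUNT(*) FROM dbo.Orders WHERE CustomerId = @id", conn))
                {
                    cmd.Parameters.AddWithValue("@id", 42); // parameterized, not string-built
                    conn.Open();
                    Console.WriteLine("Orders: {0}", cmd.ExecuteScalar());
                }
            }
        }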

    Read the article

  • Can an SSIS package in SQL Server 2005 send data to SQL Server 2008?

    - by needshelp
    Hi SSIS geniuses, I have an SSIS package in SQL Server 2005 that takes data from a flat file and puts it in a table in SQL Server 2005. Now I want it to send the data to an additional location along with the 2005 location. This new table is in SQL Server 2008. Can it be done without porting my SSIS package to SQL Server 2008? Thank you for your help in advance.

    Read the article

  • In a SQL XDL File, how do I read the waitresource attribute on process nodes which are deadlocking?

    - by skimania
    On SQL Server 2005, I'm getting a deadlock when updating two different keys in the same table. Note from below that these two waitresources have the same beginning part but different ending parts: waitresource="KEY: 6:72057594090487808 (d900ed5a6cc6)" and waitresource="KEY: 6:72057594090487808 (d900fb5261bb)". These two keys are locking, and I need to figure out why. The question: if the values in parentheses are different, why is the first half of the keys the same?

    <deadlock-list>
     <deadlock victim="processffffffff8f5863e8">
      <process-list>
       <process id="processaf02f8" taskpriority="0" logused="0" waitresource="KEY: 6:72057594090487808 (d900fb5261bb)" waittime="2281" ownerId="1370264705" transactionname="user_transaction" lasttranstarted="2010-06-17T00:35:25.483" XDES="0x69453a70" lockMode="U" schedulerid="3" kpid="7624" status="suspended" spid="339" sbid="0" ecid="0" priority="0" transcount="2" lastbatchstarted="2010-06-17T00:35:25.483" lastbatchcompleted="2010-06-17T00:35:25.483" clientapp=".Net SqlClient Data Provider" hostname="RISKBBG_VM" hostpid="5848" loginname="RiskOpt" isolationlevel="read committed (2)" xactid="1370264705" currentdb="6" lockTimeout="4294967295" clientoption1="671088672" clientoption2="128056">
        <executionStack>
         <frame procname="MKP_RISKDB.dbo.MarketDataCurrentRtUpload" line="14" stmtstart="840" stmtend="1220" sqlhandle="0x03000600005f9d24c8878f00849d00000100000000000000">
          UPDATE c WITH (ROWLOCK) SET LastUpdate = t.LastUpdate, Value = t.Value, Source = t.Source
          FROM MarketDataCurrent c INNER JOIN #TEMPTABLE2 t ON c.MDID = t.mdid; -- Insert new MDID
         </frame>
         <frame procname="adhoc" line="1" sqlhandle="0x010006004a58132228bf8d73000000000000000000000000">
          MarketDataCurrentBlbgRtUpload
         </frame>
        </executionStack>
        <inputbuf>
         MarketDataCurrentBlbgRtUpload
        </inputbuf>
       </process>
       <process id="processffffffff8f5863e8" taskpriority="0" logused="0" waitresource="KEY: 6:72057594090487808 (d900ed5a6cc6)" waittime="2281" ownerId="1370264646" transactionname="user_transaction" lasttranstarted="2010-06-17T00:35:25.450" XDES="0x1cb72be8" lockMode="U" schedulerid="5" kpid="1880" status="suspended" spid="287" sbid="0" ecid="0" priority="0" transcount="2" lastbatchstarted="2010-06-17T00:35:25.450" lastbatchcompleted="2010-06-17T00:35:25.450" clientapp=".Net SqlClient Data Provider" hostname="RISKAPPS_VM" hostpid="1424" loginname="RiskOpt" isolationlevel="read committed (2)" xactid="1370264646" currentdb="6" lockTimeout="4294967295" clientoption1="671088672" clientoption2="128056">
        <executionStack>
         <frame procname="MKP_RISKDB.dbo.MarketDataCurrent_BulkUpload" line="28" stmtstart="1062" stmtend="1720" sqlhandle="0x03000600a28e5e4ef4fd8e00849d00000100000000000000">
          UPDATE c WITH (ROWLOCK) SET LastUpdate = getdate(), Value = t.Value, Source = @source
          FROM MarketDataCurrent c INNER JOIN #MDTUP t ON c.MDID = t.mdid
          WHERE c.lastUpdate &lt; @updateTime
            and c.mdid not in (select mdid from MarketData where BloombergTicker is not null and PriceSource like &apos;Live.%&apos;)
            and c.value &lt;&gt; t.value
         </frame>
         <frame procname="adhoc" line="1" stmtstart="88" sqlhandle="0x01000600c1653d0598706ca7000000000000000000000000">
          exec MarketDataCurrent_BulkUpload @clearBefore, @source
         </frame>
         <frame procname="unknown" line="1" sqlhandle="0x000000000000000000000000000000000000000000000000">
          unknown
         </frame>
        </executionStack>
        <inputbuf>
         (@clearBefore datetime,@source nvarchar(10))exec MarketDataCurrent_BulkUpload @clearBefore, @source
        </inputbuf>
       </process>
      </process-list>
      <resource-list>
       <keylock hobtid="72057594090487808" dbid="6" objectname="MKP_RISKDB.dbo.MarketDataCurrent" indexname="PK_MarketDataCurrent" id="lock64ac7940" mode="U" associatedObjectId="72057594090487808">
        <owner-list>
         <owner id="processffffffff8f5863e8" mode="U"/>
        </owner-list>
        <waiter-list>
         <waiter id="processaf02f8" mode="U" requestType="wait"/>
        </waiter-list>
       </keylock>
       <keylock hobtid="72057594090487808" dbid="6" objectname="MKP_RISKDB.dbo.MarketDataCurrent" indexname="PK_MarketDataCurrent" id="lockffffffffb8d2dd40" mode="U" associatedObjectId="72057594090487808">
        <owner-list>
         <owner id="processaf02f8" mode="U"/>
        </owner-list>
        <waiter-list>
         <waiter id="processffffffff8f5863e8" mode="U" requestType="wait"/>
        </waiter-list>
       </keylock>
      </resource-list>
     </deadlock>
    </deadlock-list>

    Read the article

  • Does LearnDevNow offer a useful subscription for .NET-related training, with video and text reference material?

    - by user766926
    Does LearnDevNow offer a useful subscription for .NET video and text reference training material, or to learn .NET? I could not find a review of www.learnnowonline.com/learndevnow. I've been doing research on places that offer .NET / ASP.NET / C# / LINQ / SQL / jQuery video training with excellent material for developers, kind of like a Lynda.com for back-end development. However, I have not found one place that offers quality material that is easily accessible to the user and priced competitively. This is what I'm looking for:

    - Online video tutorials with free future updates (videos that can be accessed on any device, ie: Android portable devices, Mac/iPad, Linux machines)
    - Printable courseware (ie PDFs, so that you can take notes, print if necessary, and read on a tablet in case you don't have internet access)
    - Labs and easy-to-access code
    - Pre/post assessments/exams for each training course, to keep track of your progress and what you have learned. (AppDev used to offer this, but it was way too expensive: thousands for each training course, ie 1k for each. I paid about five thousand at AppDev, and now I regret that purchase, as after a few years/months the training became obsolete/outdated.)

    I looked at learnnowonline.com/learndevnow, but it seems that their courseware / text reference material can only be accessed online, and I don't know if their videos work on all mobile devices and browsers (Safari, Chrome, IE, Opera, Firefox). Also, it seems AppDev no longer exists and is now called "AppDev is now LearnNowPlus": www.learnnowonline.com/appdev. That site no longer offers prices. I tried to find reviews but could not find any. Does anyone know, or can anyone share a review about some of these types of online training service providers? I would appreciate your feedback on this, and if you can share your past experiences with their service or similar/better services, that would be even better. Thanks.

    Read the article

  • Discount Codes Galore

    - by Cassandra Clark
    Saving money is at the top of everyone's list right now. With this in mind, the Oracle Technology Network team has compiled a list of discounts available at the Oracle Store. We are also introducing an Oracle Technology Network member discount from O'Reilly Media. If you subscribe to any of the Oracle Technology newsletters, you also saw special discounts from CRC Press, Packt Publishing and Apress. We are going to do our best to bring you more offers like this every month. Now on to the discounts...

    Oracle Store offers - all below expiring May 31st 2010. Don't miss out!

    - Expand Your Productivity with Oracle Open Office and Save 15%. Enter OTNOffice at checkout. Buy Now!
    - Drive Business Agility and Performance with Industry-leading Oracle Database Management Packs. Save 10% when you purchase them at the Oracle Store. Enter OTNDBMP at checkout. Buy Now!
    - 15% Savings on Oracle Virtualization and Unbreakable Linux Support at the Oracle Store. Enter code OTNLinuxVM at checkout. Buy Now!
    - 20% Savings on Oracle SQL Developer Data Modeler. Use OTNSQL at checkout. Buy Now!

    O'Reilly Oracle Technology Network Member Offer

    O'Reilly is generously offering Oracle Technology Network members 35% off print books and 40% off eBooks. Browse Oracle titles at http://oreilly.com/pub/topic/oracle. Use discount code TECNT at checkout.

    Read the article

  • Oracle PL/SQL question for my homework in Oracle 11g class [migrated]

    - by Bjolds
      I am new to Oracle 11g programming and I have run into a tough situation with PL/SQL functions and automation. I am unsure how to create the functions to automate a college registration system. Here is what I want to do: I want to automate the registration system so that it automatically registers students, and then I want a procedure to automate the grading system. I have included the code that I have written to make most of this assignment work, which it does, but I am unsure how to incorporate automated PL/SQL functions for the registration system and the grading system, so I would greatly appreciate any help or ideas.

      set Linesize 250
      set pagesize 150

      drop table student;
      drop table faculty;
      drop table Course;
      drop table Section;
      drop table location;
      DROP TABLE courseInstructor;
      DROP TABLE Registration;
      DROP TABLE grade;

      create table student(
        studentid number(10),
        Lastname varchar2(20),
        Firstname Varchar2(20),
        MI Char(1),
        address Varchar2(20),
        city Varchar2(20),
        state Char(2),
        zip Varchar2(10),
        HomePhone Varchar2(10),
        Workphone Varchar2(10),
        DOB Date,
        Pin VARCHAR2(10),
        Status Char(1));

      ALTER TABLE Student Add Constraint Student_StudentID_pk Primary Key (studentID);

      Insert into student values (1,'xxxxxxxx','xxxxxxxxxx','x','xxxxxxxxxxxxxxx','Columbus','oh','44159','xxx-xxx-xxxx','xxx-xxx-xxxx','06-Mar-1957','1211','c');

      create table faculty(
        FacultyID Number(10),
        FirstName Varchar2(20),
        Lastname Varchar2(20),
        MI Char(1),
        workphone Varchar2(10),
        CellPhone Varchar2(10),
        Rank Varchar2(20),
        Experience Varchar2(10),
        Status Char(1));

      ALTER TABLE Faculty ADD Constraint Faculty_facultyId_PK PRIMARY KEY (FacultyID);

      insert into faculty values (1,'xxx','xxxxxxxxxxxx','xxx-xxx-xxxx','xxx-xxx-xxxx','professor','20','f');

      create table Course(
        CourseId number(10),
        CourseNumber Varchar2(20),
        CourseName Varchar(20),
        Description Varchar(20),
        CreditHours Number(4),
        Status Char(1));

      ALTER TABLE Course ADD Constraint Course_CourseID_pk PRIMARY KEY(CourseID);

      insert into course values (1,'cit 100','computer concepts','introduction to PCs','3.0','o');
      insert into course values (2,'cit 101','Database Program','Database Programming','4.0','o');
      insert into course values (3,'Math 101','Algebra I','Algebra I Concepts','5.0','o');
      insert into course values (4,'cit 102a','Pc applications','Applications 1','3.0','o');
      insert into course values (5,'cit 102b','pc applications','applications 2','3.0','o');
      insert into course values (6,'cit 102c','pc applications','applications 3','3.0','o');
      insert into course values (7,'cit 103','computer concepts','introduction systems','3.0','c');
      insert into course values (8,'cit 110','Unified language','UML design','3.0','o');
      insert into course values (9,'cit 165','cobol','cobol programming','3.0','o');
      insert into course values (10,'cit 167','C++ Programming 1','c++ programming','4.0','o');
      insert into course values (11,'cit 231','Expert Excel','spreadsheet apps','3.0','o');
      insert into course values (12,'cit 233','expert Access','database devel.','3.0','o');
      insert into course values (13,'cit 169','Java Programming I','Java Programming I','3.0','o');
      insert into course values (14,'cit 263','Visual Basic','Visual Basic Prog','3.0','o');
      insert into course values (15,'cit 275','system analysis 2','System Analysis 2','3.0','o');

      create table Section(
        SectionID Number(10),
        CourseId Number(10),
        SectionNumber VarChar2(10),
        Days Varchar2(10),
        StartTime Date,
        EndTime Date,
        LocationID Number(10),
        SeatAvailable Number(3),
        Status Char(1));

      ALTER TABLE Section ADD Constraint Section_SectionID_PK PRIMARY KEY(SectionID);

      insert into section values (1,1,'18977','r','21-Sep-2011','10-Dec-2011','1','89','o');

      create table Location(
        LocationId Number(10),
        Building Varchar2(20),
        Room Varchar2(5),
        Capacity Number(5),
        Status Char(1));

      ALTER TABLE Location ADD Constraint Location_LocationID_pk PRIMARY KEY (LocationID);

      insert into Location values (1,'Cleveland Hall','cl209','35','o');
      insert into Location values (2,'Toledo Circle','tc211','45','o');
      insert into Location values (3,'Akron Square','as154','65','o');
      insert into Location values (4,'Cincy Hall','ch100','45','o');
      insert into Location values (5,'Springfield Dome','SD','35','o');
      insert into Location values (6,'Dayton Dorm','dd225','25','o');
      insert into Location values (7,'Columbus Hall','CB354','15','o');
      insert into Location values (8,'Cleveland Hall','cl204','85','o');
      insert into Location values (9,'Toledo Circle','tc103','75','o');
      insert into Location values (10,'Akron Square','as201','46','o');
      insert into Location values (11,'Cincy Hall','ch301','73','o');
      insert into Location values (12,'Dayton Dorm','dd245','57','o');
      insert into Location values (13,'Springfield Dome','SD','65','o');
      insert into Location values (14,'Cleveland Hall','cl241','10','o');
      insert into Location values (15,'Toledo Circle','tc211','27','o');
      insert into Location values (16,'Akron Square','as311','28','o');
      insert into Location values (17,'Cincy Hall','ch415','73','o');
      insert into Location values (18,'Toledo Circle','tc111','67','o');
      insert into Location values (19,'Springfield Dome','SD','69','o');
      insert into Location values (20,'Dayton Dorm','dd211','45','o');

      Alter Table Student Add Constraint student_Zip_CK Check(Rtrim(Zip,'1234567890-') is null);
      Alter Table Student ADD Constraint Student_Status_CK Check(Status In('c','t'));
      Alter Table Student ADD Constraint Student_MI_CK2 Check(RTRIM(MI,'abcdefghijklmnopqrstuvwxyz') is Null);
      Alter Table Student Modify pin not Null;
      Alter table Faculty Add Constraint Faculty_Status_CK Check(Status In('f','a','i'));
      Alter table Faculty ADD Constraint Faculty_Rank_CK Check(Rank In ('professor','doctor','instructor','assistant','tenure'));
      Alter table Faculty ADD Constraint Faculty_MI_CK2 Check(RTRIM(MI,'abcdefghijklmnopqrstuvwxyz') is Null);

      Update Section Set Starttime = To_date('09-21-2011 6:00 PM', 'mm-dd-yyyy hh:mi pm');
      Update Section Set Endtime = To_date('12-10-2011 9:50 PM', 'mm-dd-yyyy hh:mi pm');

      alter table Section Add Constraint StartTime_Status_CK Check (starttime < Endtime);
      Alter Table Section Add Constraint Section_StartTime_ck check (StartTime < EndTime);
      Alter Table Section ADD Constraint Section_CourseId_FK FOREIGN KEY (CourseID) References Course(CourseId);
      Alter Table Section ADD Constraint Section_LocationID_FK FOREIGN KEY (LocationID) References Location (LocationId);
      Alter Table Section ADD Constraint Section_Days_CK Check(RTRIM(Days,'mtwrfsu') IS Null);

      update section set seatavailable = '99';
      Alter Table Section ADD Constraint Section_SeatsAvailable_CK Check (SeatAvailable < 100);
      Alter Table Course Add Constraint Course_CreditHours_ck check(CreditHours <= 6.0);

      update location set capacity = '99';
      Alter Table Location Add Constraint Location_Capacity_CK Check(Capacity < 100);

      Create Table Registration (
        StudentID Number(10),
        SectionID Number(10),
        Constraint Registration_pk Primary key (studentId, Sectionid));

      insert into registration values (1, 2);
      insert into registration values (2, 3);
      insert into registration values (3, 4);
      insert into registration values (4, 5);
      insert into registration values (5, 6);
      insert into registration values (6, 7);
      insert into registration values (7, 8);
      insert into registration values (8, 9);
      insert into registration values (9, 10);
      insert into registration values (10, 11);
      insert into registration values (9, 12);
      insert into registration values (8, 13);
      insert into registration values (7, 14);
      insert into registration values (6, 15);
      insert into registration values (5, 17);
      insert into registration values (4, 18);
      insert into registration values (3, 19);
      insert into registration values (2, 20);
      insert into registration values (1, 21);
      insert into registration values (2, 22);
      insert into registration values (3, 23);
      insert into registration values (4, 24);
      insert into registration values (5, 25);
      insert into registration values (6, 24);
      insert into registration values (7, 23);
      insert into registration values (8, 22);
      insert into registration values (9, 21);
      insert into registration values (10, 20);
      insert into registration values (9, 19);
      insert into registration values (8, 17);

      Create Table courseInstructor(
        FacultyID Number(10),
        SectionID Number(10),
        Constraint CourseInstructor_pk Primary key (FacultyId, SectionID));

      insert into courseInstructor values (1, 1);
      insert into courseInstructor values (2, 2);
      insert into courseInstructor values (3, 3);
      insert into courseInstructor values (4, 4);
      insert into courseInstructor values (5, 5);
      insert into courseInstructor values (5, 6);
      insert into courseInstructor values (4, 7);
      insert into courseInstructor values (3, 8);
      insert into courseInstructor values (2, 9);
      insert into courseInstructor values (1, 10);
      insert into courseInstructor values (5, 11);
      insert into courseInstructor values (4, 12);
      insert into courseInstructor values (3, 13);
      insert into courseInstructor values (2, 14);
      insert into courseInstructor values (1, 15);

      Create table grade(
        StudentID Number(10),
        SectionID Number(10),
        Grade Varchar2(1),
        Constraint grade_pk Primary key (StudentID, SectionID));

      CREATE OR REPLACE TRIGGER TR_CreateGrade
      AFTER INSERT ON Registration
      FOR EACH ROW
      BEGIN
        INSERT INTO grade (SectionID, StudentID, Grade)
        VALUES (:New.SectionID, :New.StudentID, NULL);
      END TR_CreateGrade;
      /

      CREATE OR REPLACE FORCE VIEW V_reg_student_course AS
      SELECT Registration.StudentID,
             student.LastName,
             student.FirstName,
             course.CourseName,
             Registration.SectionID,
             course.CreditHours,
             section.Days,
             TO_CHAR(StartTime, 'MM/DD/YYYY') AS StartDate,
             TO_CHAR(StartTime, 'HH:MI PM') AS StartTime,
             TO_CHAR(EndTime, 'MM/DD/YYYY') AS EndDate,
             TO_CHAR(EndTime, 'HH:MI PM') AS EndTime,
             location.Building,
             location.Room
      FROM registration, student, section, course, location
      WHERE registration.StudentID = student.StudentID
        AND registration.SectionID = section.SectionID
        AND section.LocationID = location.LocationID
        AND section.CourseID = course.CourseID;

      CREATE OR REPLACE FORCE VIEW V_teacher_to_course AS
      SELECT courseInstructor.FacultyID,
             faculty.FirstName,
             faculty.LastName,
             courseInstructor.SectionID,
             section.Days,
             TO_CHAR(StartTime, 'MM/DD/YYYY') AS StartDate,
             TO_CHAR(StartTime, 'HH:MI PM') AS StartTime,
             TO_CHAR(EndTime, 'MM/DD/YYYY') AS EndDate,
             TO_CHAR(EndTime, 'HH:MI PM') AS EndTime,
             location.Building,
             location.Room
      FROM courseInstructor, faculty, section, course, location
      WHERE courseInstructor.FacultyID = faculty.FacultyID
        AND courseInstructor.SectionID = section.SectionID
        AND section.LocationID = location.LocationID
        AND section.CourseID = course.CourseID;

      SELECT * FROM V_reg_student_course;
      SELECT * FROM V_teacher_to_course;

    Read the article

  • New P6 Reporting Database R2

    - by mark.kromer
    Along with our announced GA release of P6 Analytics R1 recently, you may have noticed that when you purchase P6 Analytics, we provide a restricted-use license for P6 Reporting Database R2. This represents an updated version of the previous P6 Reporting Database 6.2 and can be purchased individually on a per-CPU basis. Typically, you will want just the reporting database if you would like the P6 data warehouse components, such as the ETL, data models, ODS and star schemas, in order to report on that data with a reporting tool other than Oracle's; the P6 Analytics solution will only work on Oracle BI (OBI). But I pasted below some examples of a simplistic matrix report that I built from the P6 Reporting Database using Microsoft SQL Server Reporting Services. This is the Report Builder tool, which is very similar to other report-building tools on the market today, such as Crystal Reports or Oracle BI Publisher. This is an example of what you can do (in a very simple format) by using the P6 Reporting Database without P6 Analytics. Here is a quick run-down of some of the key new features in P6 Reporting Database R2 that were added as enhancements to the 6.2 version:

    • 4 new star schemas (improved projects star, project history, resource utilization and resource allocation)
    • Improved ETL performance and reliability
    • P6 security is inherited at the star schema level
    • Custom P6 project, activity and resource codes are now available as customizable dimensions in the star schemas
    • Time-phased data down to the day is now available from the star schemas
    • An updated Operational Data Store (ODS) for operational reporting that includes the WBS hierarchy
    • The ODS now includes daily spreads for activity and resource assignments

    Read the article

  • ASP.NET 4.0 Compatibility Settings for Rendering Controls

    - by Jalpesh P. Vadgama
    With ASP.NET 4.0, Microsoft has taken a great step forward in control rendering: controls now emit much cleaner HTML. There are lots of rendering enhancements in ASP.NET 4.0, and controls like Menu, ListView and others now render far cleaner markup. But recently I faced a strange problem with rendered controls. I had my site in ASP.NET 3.5 and I wanted to convert it to ASP.NET 4.0. I had applied my styles against the 3.5 rendering, and some of those items are obsolete in ASP.NET 4.0. Modifying the style sheet was a tedious job; here the ASP.NET 4.0 compatibility setting comes to the rescue. It provides full backward compatibility in terms of control rendering. You can set it in your web.config like the following:

        <system.web>
          <pages controlRenderingCompatibilityVersion="3.5|4.0"/>
        </system.web>

    Here the value of controlRenderingCompatibilityVersion is a string indicating how controls should render in the browser: if you specify 4.0, controls render with the cleaner HTML, while if you want the old legacy rendering, you can put 3.5 and controls will render the same way as they did in ASP.NET 3.5. Hope this helps you!!! Technorati Tags: ASP.NET 4.0,controlRenderingCompatibility

    Read the article

  • ODI 11g – How to override SQL at runtime?

    - by David Allan
    Following on from the posting some time back entitled ‘ODI 11g – Simple, Powerful, Flexible’, here we push the envelope even further. Rather than just having the SQL we override defined statically in the interface design, we will have it configurable via a variable… at runtime. Imagine you have a well-defined interface shape that you want fulfilled, and that shape can be satisfied from a number of different sources; that is what this allows, or, put another way, the ability for one interface to consume data from many different places using variables. The cool thing about ODI’s reference API and this approach is that it can be fantastically flexible and useful. When I use the variable as the option value and execute the top-level scenario that uses this temporary interface, I get prompted (or can be prompted, to be correct) for the value of the variable. Note I am using the <@=odiRef.getObjectName("L","EMP", "SCOTT","D")@> notation for the table reference; since this is done at runtime, the context will resolve to the correct table name, etc. Each time I execute, I could use a different source provider (obviously with some dependencies on KMs/technologies here). For example, the following Groovy snippet first executes with the SCOTT model and the EMP datastore, and the next time with the BOB model and the OTHERS datastore:

        m = new Properties();
        m.put("DEMO.SQLSTR", "select empno, deptno from <@=odiRef.getObjectName('L', 'EMP', 'SCOTT', 'D')@>");
        s = new StartupParams(m);
        runtimeAgent.startScenario("TOP", null, s, null, "GLOBAL", 5, null, true);

        m2 = new Properties();
        m2.put("DEMO.SQLSTR", "select empno, deptno from <@=odiRef.getObjectName('L', 'OTHERS', 'BOB', 'D')@>");
        s2 = new StartupParams(m2);
        runtimeAgent.startScenario("TOP", null, s2, null, "GLOBAL", 5, null, true);

    You’ll need a patch to 11.1.1.6 for this type of capability. Thanks to my ole buddy Ron Gonzalez from the Enterprise Management group for helping push the envelope!

    Read the article

  • Backing up SQL Azure

    - by Herve Roggero
    That's it!!! After many days and nights... and an amazing set of challenges, I just released the Enzo Backup for SQL Azure BETA product (http://www.bluesyntax.net). Clearly, this was one of the most challenging projects I have done so far. Why??? Because creating a highly redundant system, expecting failures at all times, for an operation that could take anywhere from a couple of minutes to a couple of hours, and still making sure that the operation completes at some point, was remarkably challenging. Some routines have more error trapping than actual code... Here are a few things I had to take into account:

    - Exponential backoff (explained in another post)
    - Dual dynamic determination of the number of rows to back up
    - Dynamic reduction of the batch rows used to restore the data
    - Implementation of a flexible BULK INSERT API that the tool could use
    - Implementation of a custom Storage REST API to handle automatic retries
    - Automatic data chunking based on blob sizes
    - Compression of data
    - Implementation of the Task Parallel Library at multiple levels, including deserialization of Azure Table rows and backup/restore operations
    - Full or partial restore operations
    - Implementation of a Ghost class to serialize/deserialize data tables

    And that's just a partial list... I will explain what some of those mean in future blog posts. A lot of the complexity had to do with implementing a form of retry logic, depending on the resource and the operation.
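    To illustrate the exponential backoff item at the top of that list, here is a generic retry sketch (an illustration of the pattern only, not Enzo Backup's actual implementation):

        using System;
        using System.Threading;

        // Generic exponential backoff: wait 1s, 2s, 4s, ... between attempts,
        // rethrowing once the attempt budget is exhausted. The delays and
        // attempt count are arbitrary illustrative choices.
        static class Retry
        {
            public static T WithBackoff<T>(Func<T> operation, int maxAttempts)
            {
                TimeSpan delay = TimeSpan.FromSeconds(1);
                for (int attempt = 1; ; attempt++)
                {
                    try
                    {
                        return operation();
                    }
                    catch (Exception)
                    {
                        if (attempt >= maxAttempts) throw;
                        Thread.Sleep(delay);
                        delay = TimeSpan.FromTicks(delay.Ticks * 2); // double each time
                    }
                }
            }
        }

    Usage would look like Retry.WithBackoff(() => RunBackupStep(), 5), where RunBackupStep stands in for whatever transient-failure-prone call is being wrapped.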

    Read the article

  • Ribbon Search: Locate MS Office Ribbon Menu Features/Functions Quickly

    - by Kavitha
    In the new versions of Microsoft Office, everything has changed with the introduction of Ribbon menus. Even though the Ribbon has many advantages that simplify accessing features, at times it is a daunting task to navigate the Ribbon menus and find a specific command. Ribbon Search is one of the interesting freeware tools for overcoming this complaint: with it, one can search the Office ribbon for any feature or function easily. It supports both Office 2007 and Office 2010 (the versions which have the ribbon). Once installation is complete, you will find a text box on top of the ribbon in all the Office applications (Outlook, Word, PowerPoint, Excel, etc.). As you type a few letters of the feature you are looking for, Ribbon Search instantly displays the path through which you can access that feature. Here is a screen grab of Ribbon Search in action: as you start typing, it shows results instantly, along with the path through which you can access the feature you are searching for. If there are multiple ways to access the feature, they are all shown in the list. Download Ribbon Search

    Read the article

  • Improving the performance of a db import process

    - by mmr
    I have a program in Microsoft Access that processes text and also inserts data into a MySQL database. This operation takes 30 minutes or less to finish. I translated it into VB.NET and it takes 2 hours to finish. The program goes like this: a text file contains individual swipes from a corresponding person; each swipe contains their id, the time and date of the swipe on the machine, and an indicator of whether it is a time-in or a time-out. I process this text, segregate the information, and insert the time-in and time-out per row. I also check if there are double occurrences in the database. After checking, I simply merge the time-in and time-out of the corresponding person into one row. This process takes 2 hours to finish in VB.NET, considering I have a table to compare against which contains 600,000+ rows. Now, I read on the internet that Python is best at text processing, and I already have a test, but I doubt it will help with the database operations. What do you think is the best programming language for this kind of problem? How can I speed up the process? My first idea was using Python instead of VB.NET, but since people here on SO are telling me this most probably won't help, I am searching for different solutions.
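    Whatever the language, the usual first fix for a slow row-by-row import is batching: one transaction, one prepared parameterized command, many executions. A sketch using MySQL Connector/Net (the connection string, table and columns are illustrative placeholders, not from the question):

        using System;
        using MySql.Data.MySqlClient;

        // Sketch: batched, parameterized inserts inside a single transaction.
        class SwipeImporter
        {
            static void Import(string[][] rows) // parsed lines: id, timestamp, direction
            {
                using (var conn = new MySqlConnection("server=localhost;database=hr;uid=user;pwd=pass"))
                {
                    conn.Open();
                    using (MySqlTransaction tx = conn.BeginTransaction())
                    using (MySqlCommand cmd = conn.CreateCommand())
                    {
                        cmd.Transaction = tx;
                        cmd.CommandText =
                            "INSERT INTO swipes (person_id, swipe_time, direction) VALUES (@id, @time, @dir)";
                        cmd.Parameters.Add("@id", MySqlDbType.Int32);
                        cmd.Parameters.Add("@time", MySqlDbType.DateTime);
                        cmd.Parameters.Add("@dir", MySqlDbType.VarChar);
                        cmd.Prepare(); // parse the statement once, not 600,000 times

                        foreach (string[] r in rows)
                        {
                            cmd.Parameters["@id"].Value = int.Parse(r[0]);
                            cmd.Parameters["@time"].Value = DateTime.Parse(r[1]);
                            cmd.Parameters["@dir"].Value = r[2];
                            cmd.ExecuteNonQuery();
                        }
                        tx.Commit(); // one commit instead of one per row
                    }
                }
            }
        }

    Moving the duplicate check into the database itself (a unique index plus INSERT IGNORE, for example) also avoids one comparison round trip per row against the 600,000-row table.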

    Read the article

  • Design and Print Your Own Christmas Cards in MS Word, Part 2: How to Print

    - by Eric Z Goodnight
    Creating greeting cards can be a lot of DIY fun around the holidays, but printing them can often be a nightmare. This simple How-To will show you how to perfectly print your half-fold card. Last week we showed you how to create a simple, attractive greeting card in Microsoft Word using Creative Commons images and basic fonts. If you missed out, it is still available, and the Word template used here can still be downloaded. If you have already made your Christmas card and are struggling to get it printed right, then this simple How-To is for you.

    Read the article

  • Error MSB4019: The imported project "C:\Program Files\MSBuild\Microsoft\VisualStudio\v10.0\WebApplications\Microsoft.WebApplication.targets" was not found. Confirm that the path in the <Import> declaration is correct, and that the file exists on disk

    - by Tim Huffam
    This error occurred on our TFS2008 build server, which we had upgraded to cater for VS2010 projects (by installing VS2010 on the build server - see this article). Error MSB4019: The imported project "C:\Program Files\MSBuild\Microsoft\VisualStudio\v10.0\WebApplications\Microsoft.WebApplication.targets" was not found. Confirm that the path in the <Import> declaration is correct, and that the file exists on disk. However, although we had installed VS2010 on the build server, we had not installed the web development components (Visual Web Developer) - this is what caused the error. To fix it, simply add the web development components:

    1. Go into Control Panel - Add or Remove Programs.
    2. Select Microsoft Visual Studio 2010, and click on Change/Remove.
    3. In the VS Maintenance Mode screens, select Add or Remove Features.
    4. In the Setup - Options page, make sure 'Visual Web Developer' is checked.
    5. Click on Update.

    You shouldn't need to restart your build service. HTH Tim

    Read the article

  • I spoke at SQL Saturday #77 and all I got was this really awesome speaker's shirt!

    - by Most Valuable Yak (Rob Volk)
    Yeah, it was 2 weeks ago, but I'm finally blogging about something! I presented Revenge: The SQL! at SQL Saturday #77 in Pensacola on June 4.  The session abstract is here, and you can download the slides from that page too.  You can see how I look in the speaker's shirt here. Overall it went pretty well.  I discovered a new bit of evil just that morning, and in a carefully considered, agonizing decision-making process that was fully documented, tested, and approved…nah, I just went ahead and added it at the last minute.  Which worked out even better than (not) planned, since it screwed me up a bit and made my point perfectly.  I had a few fans in the audience, and one of them recorded it for blackmail material posterity. I'd like to thank Karla Landrum (blog | twitter) and all the volunteers for putting together such a great event, and for being kind enough to let me present. (Note to Karla: I'll get the next $100 to you as soon as I can.  Might need a few extra days on the next $100.) Thanks to Audrey (blog | twitter), Peg, and Dorothy for attending and keeping the heckling down.  Thanks also to Aaron (blog | twitter) for providing room and board and also not heckling.  And thanks to Julie (blog | twitter) for coming up with the title for the presentation.  (boo to Julie for getting sick and bailing out on us)  And thanks to all of them for listening to a preview and offering their suggestions and advice! Cross your fingers that I get accepted at SQL Saturday 81 in Birmingham, SQL Saturday 85 in Orlando, or SQL Saturday 89 in Atlanta, or just attend them anyway!

    Read the article

  • Designing a large database with multiple sources

    - by CatchingMonkey
    I have been tasked with redesigning, or at worst optimising, the structure of a database for a data warehouse. Currently, the database has 4 source databases (due to expand to X number of others), all of which have their own data structures, naming conventions, etc. At the moment, an overnight SSIS package pulls the data from the various sources and then, for each source, converts the data into a standardised, usable format. These tables are then appended to each other, creating a 60m-row, 40-column beast! This table is then used in a variety of ways, from an OLAP cube to a web front end. The structure has been in place for a very long time, and through my work I have been able to prove the advantages of normalisation, and this is the way I would like to go. The problem for me is that the overnight process takes so long that I don't then want to spend additional time normalising the last table into something usable. Can anyone offer any insight or ideas into the best way to restructure or optimise the database efficiently? Edit: All the databases are MS SQL Server 2008 R2. Thanks in advance. CM

    Read the article
