Search Results

Search found 15456 results on 619 pages for 'global temporary tables'.

Page 118 of 619

  • Can I move the Flash temp folder/buffer (Win64)

    - by xciter
    I need to move the folder in which the Flash plugin saves its temporary files (so NOT the browser cache folder). Normally, the plugin saves its buffered content in the operating system's default temporary folder. However, I do not want to move that folder itself; I only need to change the folder that Flash uses. My operating system is Windows 7 64-bit, and I am using the latest version of the Flash plugin.

    Read the article

  • Multidimensional Thinking–24 Hours of Pass: Celebrating Women in Technology

    - by smisner
    It's Day 1 of #24HOP and it's been great to participate in this event with so many women from all over the world in one long training-fest. The SQL community has been abuzz on Twitter with running commentary, which is fun to watch while listening to the current speaker. If you missed the fun today because you're busy with all that work you've got to do, don't despair: all sessions are recorded and will be available soon. Keep an eye on the 24 Hours of Pass page for details. And the fun's not over today. Rather than run 24 hours consecutively, #24HOP is now broken down into 12 hours over two days, so check out the schedule to see if there's a session that interests you and fits your schedule. I'm pleased to announce that my business colleague Erika Bakse (Blog | Twitter) will be presenting on Day 2 – her debut presentation for a PASS event. (And I'm also pleased to say she's my daughter!)

    Multidimensional Thinking: The Presentation

    My contribution to this lineup of terrific speakers was Multidimensional Thinking. Here's the abstract: "Whether you're developing Analysis Services cubes or creating PowerPivot workbooks, you need to get into a multidimensional frame of mind to produce a model that best enables users to answer their business questions on their own. Many database professionals struggle initially with multidimensional models because the data modeling process is much different than the one they use to produce traditional, third normal form databases. In this session, I'll introduce you to the terminology of multidimensional modeling and step through the process of translating business requirements into a viable model." If you watched the presentation and want a copy of the slides, you can download a copy here. And you're welcome to download the slides even if you didn't watch the presentation, but they'll make more sense if you did!

    Kimball All the Way

    There's only so much I can cover in the time allotted, but I hope that I succeeded in my attempt to build a foundation that prepares you for starting out in business intelligence. One of my favorite resources, which gets into much more detail about all kinds of scenarios (well beyond the basics!), is The Data Warehouse Toolkit (Second Edition) by Ralph Kimball. Anything from Kimball or the Kimball Group is worth reading, though Kimball material might take reading and re-reading a few times before it makes sense. From my own experience, I found that I actually had to just build my first data warehouse using dimensional modeling on faith that I was going in the right direction, because it just didn't click with me initially. I've had years of practice since then, and I can say it does get easier with practice. The most important thing, in my opinion, is that you simply must prototype a lot and solicit user feedback, because ultimately the model needs to make sense to the users. They will definitely make sure you get it right!

    Schema Generation

    One question came up after the presentation about whether we use SQL Server Management Studio or Business Intelligence Development Studio (BIDS) to build the tables for the dimensional model. My answer? It really doesn't matter how you create the tables. Use whatever method you're comfortable with. But it just so happens that it IS possible to set up your design in BIDS as part of an Analysis Services project and to have BIDS generate the relational schema for you. I did a Webcast last year called Building a Data Mart with Integration Services that demonstrated how to do this. Yes, the subject was Integration Services, but as part of that presentation I showed how to leverage Analysis Services to build the tables, and then how to use Integration Services to load those tables. I blogged about this presentation in September 2010 and included downloads of the project that I used. In the blog post, I explained that I missed a step in the demonstration. Oops. Just as an FYI, there were two more Webcasts to finish the story begun with the data – Accelerating Answers with Analysis Services and Delivering Information with Reporting Services. If you want to just cut to the chase and learn how to use Analysis Services to build the tables, you can see the Using the Schema Generation Wizard topic in Books Online.

    Read the article

  • Where does Opera store cookies that are set to expire after restart (time 0)?

    - by marc
    Welcome. I'm looking for information on where Opera stores temporary cookies that have an expiry time of 0, which means the cookie should be deleted when the browser is restarted. Does Opera create a temporary file for these cookies and delete it after a restart (if yes, which file), or does it keep them in memory (RAM)? I tested the file "cookies4.dat" with a hex editor and it does not have these "temp" (expiry 0) cookies stored inside. Regards

    Read the article

  • SQL SERVER – Curious Case of Disappearing Rows – ON UPDATE CASCADE and ON DELETE CASCADE – T-SQL Example – Part 2 of 2

    - by pinaldave
    Yesterday I wrote a real-world story of a friend who thought they had an intrusion or virus issue, whereas the issue was really in the code. I strongly suggest you read my earlier blog post Curious Case of Disappearing Rows – ON UPDATE CASCADE and ON DELETE CASCADE – Part 1 of 2 before continuing this blog post, as this is the second part of that post. Let me reproduce the simple scenario in T-SQL.

    Building Sample Data

      USE [TestDB]
      GO
      -- Creating Table Products
      CREATE TABLE [dbo].[Products](
      [ProductID] [int] NOT NULL,
      [ProductDesc] [varchar](50) NOT NULL,
      CONSTRAINT [PK_Products] PRIMARY KEY CLUSTERED ( [ProductID] ASC )) ON [PRIMARY]
      GO
      -- Creating Table ProductDetails
      CREATE TABLE [dbo].[ProductDetails](
      [ProductDetailID] [int] NOT NULL,
      [ProductID] [int] NOT NULL,
      [Total] [int] NOT NULL,
      CONSTRAINT [PK_ProductDetails] PRIMARY KEY CLUSTERED ( [ProductDetailID] ASC )) ON [PRIMARY]
      GO
      ALTER TABLE [dbo].[ProductDetails] WITH CHECK
      ADD CONSTRAINT [FK_ProductDetails_Products] FOREIGN KEY([ProductID])
      REFERENCES [dbo].[Products] ([ProductID])
      ON UPDATE CASCADE
      ON DELETE CASCADE
      GO
      -- Insert Data into Table
      USE TestDB
      GO
      INSERT INTO Products (ProductID, ProductDesc)
      SELECT 1, 'Bike'
      UNION ALL
      SELECT 2, 'Car'
      UNION ALL
      SELECT 3, 'Books'
      GO
      INSERT INTO ProductDetails ([ProductDetailID],[ProductID],[Total])
      SELECT 1, 1, 200
      UNION ALL
      SELECT 2, 1, 100
      UNION ALL
      SELECT 3, 1, 111
      UNION ALL
      SELECT 4, 2, 200
      UNION ALL
      SELECT 5, 3, 100
      UNION ALL
      SELECT 6, 3, 100
      UNION ALL
      SELECT 7, 3, 200
      GO

    Select Data from Tables

      -- Selecting Data
      SELECT * FROM Products
      SELECT * FROM ProductDetails
      GO

    Delete Data from Products Table

      -- Deleting Data
      DELETE FROM Products
      WHERE ProductID = 1
      GO

    Select Data from Tables Again

      -- Selecting Data
      SELECT * FROM Products
      SELECT * FROM ProductDetails
      GO

    Clean up Data

      -- Clean up
      DROP TABLE ProductDetails
      DROP TABLE Products
      GO

    My friend was confused because no DELETE was ever fired against the ProductDetails table, yet rows were being deleted from it. The reason is the foreign key created between the Products and ProductDetails tables with the keywords ON DELETE CASCADE. When ON DELETE CASCADE is specified and data is deleted from Table A, any rows in another table that reference it through the foreign key are deleted as well.

    Workaround 1: Design Changes – 3 Tables

    Change the design to have more than two tables. Create one Product Master table with all the products; it should historically store the complete product list, and no product should ever be removed from it. Add another table called Current Product that contains only the products that should be visible in the product catalogue, and another table called ProductHistory. There should be no use of the CASCADE keyword among them.

    Workaround 2: Design Changes – Column IsVisible

    You can keep the same two tables, 1) Products and 2) ProductDetails. Add a column with the BIT datatype and name it IsVisible. Now change your application code to display the catalogue based on this column. There should be no need to delete anything. (A minimal sketch of this workaround follows this post.)

    Workaround 3: Bad Advice (bad advice begins here)

    I call this bad advice because it is going to be bad advice for sure. You should make the necessary design changes and not use poor workarounds that can further damage the system and database integrity. Here are the examples: 1) Do not delete the data – well, this is not a real solution, but it can buy time to implement design changes. 2) Do not have ON DELETE CASCADE – in this case, you will have entries in ProductDetails with no corresponding ProductID, and later on there will be lots of confusion. 3) Duplicate the data – you can move all the data of the Products table into the ProductDetails table and repeat it on each row, then remove the CASCADE code. This will let you delete Products rows without any issue. There are so many things wrong with this suggestion that I will not even start here. (Bad advice ends here)

    Well, did I miss anything? Please help me with your suggestions.

    Reference: Pinal Dave (http://blog.sqlauthority.com)

    Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
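
    A minimal T-SQL sketch of Workaround 2 above (not part of the original post; it reuses the sample Products table and assumes existing rows default to visible):

      -- Add the flag and default existing rows to visible
      ALTER TABLE dbo.Products ADD IsVisible BIT NOT NULL CONSTRAINT DF_Products_IsVisible DEFAULT (1);
      GO
      -- "Soft delete" instead of DELETE, so no cascade is ever triggered
      UPDATE dbo.Products SET IsVisible = 0 WHERE ProductID = 1;
      -- Catalogue query shows only visible products
      SELECT ProductID, ProductDesc FROM dbo.Products WHERE IsVisible = 1;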

    Read the article

  • Alternative to the DownThemAll extension for Firefox

    - by Nrew
    Do you know of any good alternative to the Firefox extension DownThemAll? When I try to download a second time from Megaupload (after the first download has finished), I get a "temporary" error that is not really temporary, because it lasts until you clear the cache/history.

    Read the article

  • Entity Framework with MySQL - Timeout Expired while Generating Model

    - by Nathan Taylor
    I've constructed a database in MySQL and I am attempting to map it out with Entity Framework, but I start running into "GenerateSSDLException"s whenever I try to add more than about 20 tables to the EF context. An exception of type 'Microsoft.Data.Entity.Design.VisualStudio.ModelWizard.Engine.ModelBuilderEngine+GenerateSSDLException' occurred while attempting to update from the database. The exception message is: 'An error occurred while executing the command definition. See the inner exception for details.' Fatal error encountered during command execution. Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding. There's nothing special about the affected tables, and it's never the same table(s); it's just that after a certain (unspecified) number of tables have been added, the context can no longer be updated without the "Timeout expired" error. Sometimes only one table is left over, and sometimes three; the results are pretty unpredictable. Furthermore, the variance in the number of tables that can be added before the error suggests that the problem lies in the size of the query generated to update the context, which includes both the existing table definitions and the new tables being added. Essentially, the SQL query is getting too large and it's failing to execute for some reason. If I generate the model with EdmGen2 it works without any errors, but the generated EDMX file cannot be updated within Visual Studio without producing the aforementioned exception. In all likelihood the source of this problem lies in the tool within Visual Studio, given that EdmGen2 works fine, but I'm hoping that others can offer some advice on how to approach this very unusual issue, because it seems like I'm not the only person experiencing it. One suggestion a colleague offered was maintaining two separate EDMX files with some table crossover, but that seems like a pretty ugly fix in my opinion. I suppose this is what I get for trying to use "new technology". :(

    Read the article

  • xl2tpd[845]: parse_config: line 13: data 'ipsec sared=yes' occurs with no context

    - by mmc18
    When I execute xl2tpd I am getting the following error:

      # xl2tpd -D
      xl2tpd[845]: parse_config: line 13: data 'ipsec sared=yes' occurs with no context
      xl2tpd[845]: init: Unable to load config file

    When I remove line 13, I get the same error for line 14; therefore I do not think the problem is with "ipsec sared". Here is my configuration file xl2tpd.conf (Linux Ubuntu 12.04):

      ;Openswan IPsec 2.6.37; xl2tpd version: xl2tpd-1.3.1
      ; [global]
      ipsec sared=yes
      listen-addr=47.168.137.27
      ; [lns default]
      ip range = 192.168.1.10-192.168.1.20
      local ip = 192.168.1.1
      require chap = yes
      refuse pap = yes
      require authentication = yes
      ppp debug = yes
      pppoptfile = /etc/ppp/options.xl2tpd
      length bit = yes
      name=LinuxIPSECVPN

    ANSWER (since I don't have enough reputation, I am writing it over here): removing the ";" character at the beginning of [global] and [lns default] solved the issue. At first I thought that [global] and [lns default] were just comments.

    Read the article

  • MySQL server upgrade problem from 5.0 to 5.1

    - by Avinash
    Hi, I have upgraded my MySQL server from 5.0 to 5.1, but I am having a problem with tables that use the InnoDB storage engine. My default engine is InnoDB, so it is enabled on my server, yet tables with the InnoDB engine are not displayed in phpMyAdmin. Tables with MyISAM display properly. I also can't run a query against a table with the InnoDB engine. Thanks, Avinash
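
    A quick diagnostic sketch (not from the original question; 'mydb' is a placeholder database name). After a 5.0 to 5.1 upgrade, InnoDB sometimes fails to start because of obsolete options in my.cnf, in which case its tables appear to vanish while MyISAM tables keep working:

      SHOW ENGINES;                    -- InnoDB should report YES or DEFAULT, not DISABLED
      SHOW TABLE STATUS FROM mydb;     -- Engine column is NULL when the engine is unavailable
      SHOW VARIABLES LIKE 'innodb%';   -- confirms which InnoDB settings actually loaded

    If InnoDB shows up as disabled, the MySQL error log usually names the offending my.cnf option.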

    Read the article

  • Restore Database from MSSql 2008 to MSSql 2005

    - by Nirmal
    Hello... I have created a set of tables (around 20) in MSSql 2008 and entered around 1000 records into the appropriate tables. The issue is that I want those same tables, with all the entered data, in MSSql 2005 (SQLEXPRESS). Obviously taking a backup and restoring it into MSSql 2005 won't work, as backups are not backward compatible. Any suggestion would be appreciated.... Thanks in advance....

    Read the article

  • Recover Data Entered in MSSql 2008 to MSSql 2005

    - by Nirmal
    Hello... I have created a set of tables (around 20) in MSSql 2008 and entered around 1000 records into the appropriate tables. The issue is that I want those same tables, with all the entered data, in MSSql 2005 (SQLEXPRESS). Obviously taking a backup and restoring it into MSSql 2005 won't work, as backups are not backward compatible. Any suggestion would be appreciated.... Thanks in advance....

    Read the article

  • Restore Database from SQL Server 2008 to SQL Server 2005

    - by Nirmal
    I have created a set of tables (around 20) in SQL Server 2008 and entered around 1000 records into the appropriate tables. The issue is that I want those same tables, with all the entered data, in SQL Server 2005 (SQLEXPRESS). Obviously taking a backup and restoring it into SQL Server 2005 won't work, as backups are not backward compatible. Any suggestion would be appreciated....

    Read the article

  • ASP.NET Creating a Rich Repeater, DataBind wiping out custom added controls...

    - by tonyellard
    So... I had this clever idea that I'd create my own Repeater control that implements paging and sorting by inheriting from Repeater and extending its capabilities. I found some information and bits and pieces on how to go about this and everything seemed OK... I created a WebControlLibrary to house my custom controls. Along with the enriched repeater, I created a composite control that would act as the "pager bar", having forward, back and page selection. My pager bar works 100% on its own, properly firing a page-changed event when the user interacts with it. The rich repeater databinds without issue, but when the databind fires (when I call base.DataBind()), the control collection is cleared out and my pager bars are removed. This screws up the viewstate for the pager bars, making them unable to fire their events properly or maintain their state. I've tried adding the controls back to the collection after base.DataBind() fires, but that doesn't solve the issue; I start to get very strange results, including problems with altering the hierarchy of the control tree (resolved by adding [ViewStateModeById]). Before I go back to the drawing board and create a second composite control which contains a repeater and the pager bars (so that the repeater isn't responsible for the pager bars' viewstate), are there any thoughts about how to resolve the issue? In the interest of share and share alike, the code for the repeater itself is below; the pager bars aren't as significant, as the issue is really the maintaining of state for any additional child controls. (Forgive the roughness of some of the code... it's still a work in progress.)

      using System;
      using System.Collections.Generic;
      using System.ComponentModel;
      using System.Text;
      using System.Data;
      using System.Web;
      using System.Web.UI;
      using System.Web.UI.WebControls;

      [ViewStateModeById]
      public class SortablePagedRepeater : Repeater, INamingContainer
      {
          private SuperRepeaterPagerBar topBar = new SuperRepeaterPagerBar();
          private SuperRepeaterPagerBar btmBar = new SuperRepeaterPagerBar();

          protected override void OnInit(EventArgs e)
          {
              Page.RegisterRequiresControlState(this);
              InitializeControls();
              base.OnInit(e);
              EnsureChildControls();
          }

          protected void InitializeControls()
          {
              topBar.ID = this.ID + "__topPagerBar";
              topBar.NumberOfPages = this._currentProperties.numOfPages;
              topBar.CurrentPage = this.CurrentPageNumber;
              topBar.PageChanged += new SuperRepeaterPagerBar.PageChangedEventHandler(PageChanged);

              btmBar.ID = this.ID + "__btmPagerBar";
              btmBar.NumberOfPages = this._currentProperties.numOfPages;
              btmBar.CurrentPage = this.CurrentPageNumber;
              btmBar.PageChanged += new SuperRepeaterPagerBar.PageChangedEventHandler(PageChanged);
          }

          protected override void CreateChildControls()
          {
              EnsureDataBound();
              this.Controls.Add(topBar);
              this.Controls.Add(btmBar);
              //base.CreateChildControls();
          }

          private void PageChanged(object sender, int newPage)
          {
              this.CurrentPageNumber = newPage;
          }

          public override void DataBind()
          {
              //pageDataSource();
              //DataBind removes all controls from control collection...
              base.DataBind();
              Controls.Add(topBar);
              Controls.Add(btmBar);
          }

          private void pageDataSource()
          {
              //Create paged data source
              PagedDataSource pds = new PagedDataSource();
              pds.PageSize = this.ItemsPerPage;
              pds.AllowPaging = true;

              // first get a PagedDataSource going and perform sort if possible...
              if (base.DataSource is System.Collections.IEnumerable)
              {
                  pds.DataSource = (System.Collections.IEnumerable)base.DataSource;
              }
              else if (base.DataSource is System.Data.DataView)
              {
                  DataView data = (DataView)DataSource;
                  if (this.SortBy != null && data.Table.Columns.Contains(this.SortBy))
                  {
                      data.Sort = this.SortBy;
                  }
                  pds.DataSource = data.Table.Rows;
              }
              else if (base.DataSource is System.Data.DataTable)
              {
                  DataTable data = (DataTable)DataSource;
                  if (this.SortBy != null && data.Columns.Contains(this.SortBy))
                  {
                      data.DefaultView.Sort = this.SortBy;
                  }
                  pds.DataSource = data.DefaultView;
              }
              else if (base.DataSource is System.Data.DataSet)
              {
                  DataSet data = (DataSet)DataSource;
                  if (base.DataMember != null && data.Tables.Contains(base.DataMember))
                  {
                      if (this.SortBy != null && data.Tables[base.DataMember].Columns.Contains(this.SortBy))
                      {
                          data.Tables[base.DataMember].DefaultView.Sort = this.SortBy;
                      }
                      pds.DataSource = data.Tables[base.DataMember].DefaultView;
                  }
                  else if (data.Tables.Count > 0)
                  {
                      if (this.SortBy != null && data.Tables[0].Columns.Contains(this.SortBy))
                      {
                          data.Tables[0].DefaultView.Sort = this.SortBy;
                      }
                      pds.DataSource = data.Tables[0].DefaultView;
                  }
                  else
                  {
                      throw new Exception("DataSet doesn't have any tables.");
                  }
              }
              else if (base.DataSource == null)
              {
                  // don't do anything?
              }
              else
              {
                  throw new Exception("DataSource must be of type System.Collections.IEnumerable. The DataSource you provided is of type " + base.DataSource.GetType().ToString());
              }

              if (pds != null && base.DataSource != null)
              {
                  //Make sure that the page doesn't exceed the maximum number of pages available
                  if (this.CurrentPageNumber >= pds.PageCount)
                  {
                      this.CurrentPageNumber = pds.PageCount - 1;
                  }

                  //Set up paging values...
                  btmBar.CurrentPage = topBar.CurrentPage = pds.CurrentPageIndex = this.CurrentPageNumber;
                  this._currentProperties.numOfPages = btmBar.NumberOfPages = topBar.NumberOfPages = pds.PageCount;

                  base.DataSource = pds;
              }
          }

          public override object DataSource
          {
              get { return base.DataSource; }
              set
              {
                  //init();
                  //reset paging/sorting values since we've potentially changed data sources.
                  base.DataSource = value;
                  pageDataSource();
              }
          }

          protected override void Render(HtmlTextWriter writer)
          {
              topBar.RenderControl(writer);
              base.Render(writer);
              btmBar.RenderControl(writer);
          }

          [Serializable]
          protected struct CurrentProperties
          {
              public int pageNum;
              public int itemsPerPage;
              public int numOfPages;
              public string sortBy;
              public bool sortDir;
          }

          protected CurrentProperties _currentProperties = new CurrentProperties();

          protected override object SaveControlState()
          {
              return this._currentProperties;
          }

          protected override void LoadControlState(object savedState)
          {
              this._currentProperties = (CurrentProperties)savedState;
          }

          [Category("Status")]
          [Browsable(true)]
          [NotifyParentProperty(true)]
          [DefaultValue("")]
          [Localizable(false)]
          public string SortBy
          {
              get { return this._currentProperties.sortBy; }
              set
              {
                  //If sorting by the same column, swap the sort direction.
                  if (this._currentProperties.sortBy == value)
                  {
                      this.SortAscending = !this.SortAscending;
                  }
                  else
                  {
                      this.SortAscending = true;
                  }
                  this._currentProperties.sortBy = value;
              }
          }

          [Category("Status")]
          [Browsable(true)]
          [NotifyParentProperty(true)]
          [DefaultValue(true)]
          [Localizable(false)]
          public bool SortAscending
          {
              get { return this._currentProperties.sortDir; }
              set { this._currentProperties.sortDir = value; }
          }

          [Category("Status")]
          [Browsable(true)]
          [NotifyParentProperty(true)]
          [DefaultValue(25)]
          [Localizable(false)]
          public int ItemsPerPage
          {
              get { return this._currentProperties.itemsPerPage; }
              set { this._currentProperties.itemsPerPage = value; }
          }

          [Category("Status")]
          [Browsable(true)]
          [NotifyParentProperty(true)]
          [DefaultValue(1)]
          [Localizable(false)]
          public int CurrentPageNumber
          {
              get { return this._currentProperties.pageNum; }
              set
              {
                  this._currentProperties.pageNum = value;
                  pageDataSource();
              }
          }
      }

    Read the article

  • Database design for very large amount of data

    - by Hossein
    Hi, I am working on a project involving a large amount of data from the delicious website. The available data files contain "Date, UserId, Url, Tags" (one record per bookmark). I normalized my database to 3NF, and because of the nature of the queries that we want to run in combination, I came down to 6 tables... The design looks fine; however, now that a large amount of data is in the database, most of the queries need to join at least 2 tables together to get the answer, sometimes 3 or 4. At first we didn't have any performance issues, because for testing purposes we hadn't added much data to the database. Now that we have a lot of data, simply joining extremely large tables takes a lot of time, and for our project, which has to be real-time, that is a disaster. I was wondering how big companies solve these issues. It looks like normalizing tables just adds complexity, but how do big companies handle large amounts of data in their databases? Don't they do normalization? Thanks
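
    A hedged sketch of two common tactics (the schema is not shown in the question, so the table and column names below are hypothetical, and the syntax is T-SQL):

      -- 1) Covering index so a hot join can be answered from the index alone
      CREATE INDEX IX_BookmarkTag_Bookmark ON BookmarkTag (BookmarkID) INCLUDE (TagID);

      -- 2) A denormalized, read-only reporting copy, rebuilt in the background,
      --    so real-time reads avoid the multi-table join entirely
      SELECT b.BookmarkID, b.UserID, b.Url, b.CreatedAt, t.TagName
      INTO   BookmarkFlat
      FROM   Bookmark b
      JOIN   BookmarkTag bt ON bt.BookmarkID = b.BookmarkID
      JOIN   Tag t          ON t.TagID = bt.TagID;

    Large sites typically keep the normalized tables as the system of record and serve reads from denormalized or cached copies.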

    Read the article

  • Export ASPNETDB data to another Database.

    - by raziiq
    Hi there. I am developing in Visual Web Developer 2008. I have SQLEXPRESS 2005 and SQL Management Studio 2008 installed on my PC. I purchased an MS SQL 2008 database on DiscountASP.net. The host provides only 1 database, but my project has 2 databases: one is the ASPNETDB that contains the roles and users etc. (created using the Website Configuration Wizard), and the other is my database containing the data for my website, named MainDB. As the host allows only 1 database, I exported ASPNETDB's tables and stored procedures to my MainDB using aspnet_regsql.exe. The problem is that the stored procedures and tables were exported to my MainDB, but the data was not; I mean there are no users in the tables. My question is: how do I export everything from ASPNETDB, including stored procedures, tables and data, to my MainDB?
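
    A hedged sketch of one way to move the missing rows (not from the original question; it assumes both databases are reachable from the same SQL Server instance and that aspnet_regsql.exe has already created the empty membership schema in MainDB). The table list is abbreviated and ordered parent-first because of the foreign keys:

      INSERT INTO MainDB.dbo.aspnet_Applications SELECT * FROM ASPNETDB.dbo.aspnet_Applications;
      INSERT INTO MainDB.dbo.aspnet_Users        SELECT * FROM ASPNETDB.dbo.aspnet_Users;
      INSERT INTO MainDB.dbo.aspnet_Membership   SELECT * FROM ASPNETDB.dbo.aspnet_Membership;
      INSERT INTO MainDB.dbo.aspnet_Roles        SELECT * FROM ASPNETDB.dbo.aspnet_Roles;
      INSERT INTO MainDB.dbo.aspnet_UsersInRoles SELECT * FROM ASPNETDB.dbo.aspnet_UsersInRoles;

    When the two databases live on different servers (local SQLEXPRESS versus the host), the same idea can be applied by generating INSERT scripts or using a data-compare/import tool instead of cross-database queries.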

    Read the article

  • Algorithm for a combinatorics problem

    - by peiska
    I am trying to solve a combinatorics problem. It seems easy, but I am having some trouble with it. I have at most X tables and N persons to sit at the tables. Each table can have 1 to N seating places, and people sit on only one side of a rectangular table (so the order in which people sit matters). I want to write code that can calculate all the distributions of seating places for 1 up to K tables. For example, if I have 12 persons and 1 table, there are 479001600 ways of seating them (that's easy; I used the factorial of 12). But if I have 12 persons and 3 tables, there are 4390848000 ways of seating them. I've tried different solutions but was not able to find the correct one. I tried dividing 12 by 3 and then taking the factorial of the result (it didn't work), and I tried 12! * 3 (it didn't work either). Can someone give me a tip on an algorithm I can use?
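
    A worked formula that matches the numbers in the question (a sketch, assuming every table that is used seats at least one person and the tables themselves are interchangeable; only the left-to-right order at each table matters). The count for exactly $k$ tables is the unsigned Lah number, and "1 up to K tables" is the sum over $k$:

      \[
        L(n,k) = \binom{n-1}{k-1}\,\frac{n!}{k!},
        \qquad
        \mathrm{ways}(n,K) = \sum_{k=1}^{K} L(n,k)
      \]

    Checking against the figures given: $L(12,1) = \binom{11}{0}\cdot 12!/1! = 479001600$ and $L(12,3) = \binom{11}{2}\cdot 12!/3! = 55 \cdot 79833600 = 4390848000$.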

    Read the article

  • Doctrine findBy relation

    - by iggnition
    I'm having trouble selecting a subset of data with Doctrine. I have 3 tables: Location, Contact and Contact_location. The Contact and Location tables hold a name and an id; the other table holds only IDs. For instance:

      Location:         loc_id: 1, name: detroit
      Contact:          contact_id: 1, name: Mike
      Contact_location: loc_id: 1, contact_id: 1

    In Doctrine there is a many-to-many relation between the Location and Contact tables, with Contact_location as the ref_class. What I want to do is, on my location page, find all contacts where, for instance, loc_id = 1. I tried:

      $this->installedbases = Doctrine::getTable('contact')->findByloc_id(1);

    hoping Doctrine would see the relation and fetch it, but it does not. How can I make Doctrine search the relevant related tables? I read it can be done using findBy, but I find the documentation unclear.
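
    For reference, a sketch of the plain SQL that the Doctrine call ultimately needs to produce (column names taken from the example above; in Doctrine this would normally be expressed as a DQL query or through the relation rather than findBy):

      SELECT c.*
      FROM contact c
      JOIN contact_location cl ON cl.contact_id = c.contact_id
      WHERE cl.loc_id = 1;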

    Read the article

  • Designing a game database

    - by Ronald
    I'm trying to design a database to record game histories for a game I'm working on. I have 3 different tables: users, gamehistories, and gamescores. Columns for the tables:

      users:         uid, displayname, email
      gamehistories: gid, pubID, start (datetime), end (datetime)
      gamescores:    gid, uid, score

    I am trying to produce the following result set given a userID (uid): opponent's displayname, my score, opponent's score, duration. Any ideas? Is my design ok? How can I query these tables to get game histories for a given uid?
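
    A hedged T-SQL sketch of one way to query it (not from the original question; it assumes exactly two players per game and that @uid holds the given user's id):

      SELECT o.displayname                          AS opponent,
             me.score                               AS my_score,
             opp.score                              AS opponent_score,
             DATEDIFF(MINUTE, gh.start, gh.[end])   AS duration_minutes
      FROM   gamescores me
      JOIN   gamescores opp ON opp.gid = me.gid AND opp.uid <> me.uid
      JOIN   users o        ON o.uid = opp.uid
      JOIN   gamehistories gh ON gh.gid = me.gid
      WHERE  me.uid = @uid;

    The design itself looks reasonable for two-player games; games with more than two players would simply return one row per opponent.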

    Read the article

  • PHP, MySQL Per row output

    - by Jake
    Been all over the place and spent 8 hours working on loops and math... still can't get it. I am writing a control panel for a customer. The input form allows them to add a package deal to their web page. Each package deal is stored in MySQL in the following format:

      id, header, item1, item2, item3 ... item12, price, savings

    The output is a table with 'header' being the name of the package, each item being one of the items in the package stored in a row within the table, and then the price and amount saved stored in rows as well (all of this is input via a form and an INSERT statement by the customer; that part works). Here is my PHP:

      include 'dbconnect.php';
      $query = "SELECT * FROM md";
      $result = mysql_query($query);
      while ($row = mysql_fetch_array($result)) {
          // Print 4 tables with header, items, price, savings
          // Go to next row (on main page) and print 4 more items
          // Do this until all tables have been printed
      }
      mysql_free_result($result);
      mysql_close($conn);
      ?>

    Right now the main page is simply a header div, main div, content wrap and footer div. The ideal HTML output of a 'package' table (not perfect and a little confusing) is: $header, then $item1, $item2, etc., then $price and $savings. I basically want the PHP to print this 4 times per row on the main page, then move to the next row and print 4 more, and continue until there are no more items in the array. So it could be row 1: 4 tables, row 2: 4 tables, row 3: 3 tables. Long, confusing and frustrating. I'm about to just do 1 item per row... please help. Jake

    Read the article

  • SQL Server 2005 user permissions

    - by karl
    I have created a database and some dbo tables. Now I want to create a user that can read and write to these tables, but not modify or drop them. However, I want this user to be able to create his own tables and do whatever he wants with those. Is this possible? Could someone explain how this can be done?
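
    A minimal T-SQL sketch of one way to do this (the login, password and schema names are placeholders, not from the original question): grant data access only on the dbo schema, then give the user a separate schema he owns for his own tables.

      CREATE LOGIN AppUser WITH PASSWORD = 'StrongP@ssw0rd!';
      CREATE USER AppUser FOR LOGIN AppUser;
      GO
      -- read/write on existing dbo tables, but no ALTER or DROP rights
      GRANT SELECT, INSERT, UPDATE, DELETE ON SCHEMA::dbo TO AppUser;
      GO
      -- a schema the user owns outright, plus the database-level right to create tables
      CREATE SCHEMA Sandbox AUTHORIZATION AppUser;
      GRANT CREATE TABLE TO AppUser;

    With this setup the user can create, alter and drop tables only inside the Sandbox schema, while the dbo tables remain read/write only.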

    Read the article

  • How do I retrieve a list of base class objects without joins using NHibernate ICriteria?

    - by Kristoffer
    Let's say I have a base class called Pet and two subclasses Cat and Dog that inherit Pet. I simply map these to three tables Pet, Cat and Dog, where the Pet table contains the base class properties and the Cat and Dog tables contain a foreign key to the Pet table and any additional properties specific to a cat or dog. A joined subclass strategy. Now, using NHibernate and ICriteria, how can I get a list of "pure" Pet objects (not cats or dogs, just pets), without making any joins to the other tables?

    Read the article

  • How to fill a DataSet when SQL returns more than one table

    - by Shantanu Gupta
    How do I fill multiple tables in a DataSet? I am using a query that returns four tables, and at the front end I am trying to fill all four resulting tables into a DataSet. Here is my query; it is not complete, but it is just a reference for my question:

      Select * from tblxyz compute sum(col1)

    Suppose this query returns more than one table. I want to fill all of the tables into my DataSet. I am filling the result like this:

      con.open();
      adp.fill(dset);
      con.close();

    Now when I check this DataSet, it shows me that it has four tables, but only the first table's data is displayed in it; the other three don't even have a schema. What do I need to do to get the desired output?

    Read the article

  • SQL Server Table locks in long query - Solution: NoLock?

    - by Kovu
    A report in my application runs a query that takes between 5 and 15 seconds (depending on the number of rows returned). The query has 8 joins to nearly all of the main tables of my application (customers, sales, units etc.). A little tool shows me that during this time all 8 tables are locked with a shared table lock, which means no update operation can run in that window. A friend suggested putting a NOLOCK hint on every join in the query that does not strictly need 100% correct data (a dirty read is acceptable), so that only 1 of the 8 tables is locked completely. Is that a good solution? For a report in which 99% of the data comes from one table, is it reasonable to leave the lower-priority tables unlocked?
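
    A hedged sketch of what the suggestion looks like in T-SQL (table and column names are invented; the question only names customers, sales and units). Each WITH (NOLOCK) join reads uncommitted data, so it should only go on joins where a dirty read is acceptable:

      SELECT s.SaleID, c.CustomerName, u.UnitName, s.Amount
      FROM   Sales s                                   -- keep normal locking on the main table
      JOIN   Customers c WITH (NOLOCK) ON c.CustomerID = s.CustomerID
      JOIN   Units     u WITH (NOLOCK) ON u.UnitID     = s.UnitID;

    An alternative worth weighing is enabling READ_COMMITTED_SNAPSHOT on the database, which lets readers avoid blocking writers without resorting to dirty reads.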

    Read the article

  • Default Value or Binding in "Transfer SQL Server Object Task"

    - by Kronass
    Hi, I want to move 500 tables from one database to another with their data and constraints. All the tables have columns with default values. I used SSIS with the "Transfer SQL Server Object Task" and chose to copy all tables, copy data and primary keys; it copies the tables, but not the default bindings. I tried the CopyAllDRIObjects property in SQL Server 2008, but got the same result. How can I copy all tables from one database to another with their data while maintaining their constraints?
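
    A hedged workaround sketch (not part of the SSIS task itself): run this on the source database to generate ALTER TABLE statements for every default constraint, then execute the generated script on the destination after the transfer.

      SELECT 'ALTER TABLE ' + QUOTENAME(SCHEMA_NAME(t.schema_id)) + '.' + QUOTENAME(t.name)
           + ' ADD CONSTRAINT ' + QUOTENAME(dc.name)
           + ' DEFAULT ' + dc.definition
           + ' FOR ' + QUOTENAME(c.name) + ';'
      FROM sys.default_constraints AS dc
      JOIN sys.tables  AS t ON t.object_id = dc.parent_object_id
      JOIN sys.columns AS c ON c.object_id = dc.parent_object_id
                           AND c.column_id = dc.parent_column_id;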

    Read the article

  • Trying to verify understanding of foreign keys SQL Server

    - by msarchet
    So I'm working on a learning project to expose myself to some things I don't get to do at work. I'm making a simple bug and case tracking app (I know there are a million of these; it's just to work with some tools I don't normally get to use). While designing my database I realized I've never actually set up a column as a foreign key in any of my projects; I've used foreign keys before, but never actually created one myself. So I've designed my database as follows, which I think is close to correct (at least for the initial layout). However, when I try to add the FKs to the linking tables I get an error saying, "The tables present in the relationship must have the same number of columns". I'm doing this in SQL Server Management Studio by going to the Keys 'folder' and adding a FK. Is there something I am doing wrong here? I don't understand why the tables would have to have the same number of columns for me to add a FK relationship between them.
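
    For what it's worth, a hedged T-SQL sketch with hypothetical Cases/Bugs tables: a foreign key only requires the referencing column list to match the referenced key's column list (count and types), not the tables' overall column counts. The SSMS error usually means the two column lists picked in the "Tables and Columns" dialog don't line up.

      CREATE TABLE dbo.Cases (
          CaseID INT NOT NULL PRIMARY KEY,
          Title  NVARCHAR(200) NOT NULL
      );

      CREATE TABLE dbo.Bugs (
          BugID  INT NOT NULL PRIMARY KEY,
          CaseID INT NOT NULL,
          Detail NVARCHAR(MAX) NULL,
          CONSTRAINT FK_Bugs_Cases FOREIGN KEY (CaseID) REFERENCES dbo.Cases (CaseID)
      );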

    Read the article

  • makefile pattern rules: single wildcard, multiple instances in prerequisite

    - by johndashen
    Hi all, hopefully this is a basic question about make pattern rules: I want to use a wildcard more than once in a prerequisite for a rule, i.e. in my Makefile I have

      data/%P1.m: $(PROJHOME)/data/%/ISCAN/%P1.RAW
              @echo " Writing temporary matlab file for $*"
              # do something

      data/%P2.m: $(PROJHOME)/data/%/ISCAN/AGP2.RAW
              @echo " Writing temporary matlab file for $*"
              # do something

    In this example, I try to invoke make when the wildcard % is AG. Both files $(PROJHOME)/data/AG/ISCAN/AGP1.RAW and $(PROJHOME)/data/AG/ISCAN/AGP2.RAW exist. I attempt the following make commands and get this output:

      [jshen@iLab10 gender-diffs]$ make data/AGP1.m
      make: *** No rule to make target `data/AGP1.m'.  Stop.
      [jshen@iLab10 gender-diffs]$ make data/AGP2.m
        Writing temporary matlab file for AG, part 2...
      [jshen@iLab10 gender-diffs]$ ls data/AG/ISCAN/AG*
      data/AG/ISCAN/AGP1.RAW  data/AG/ISCAN/AGP2.RAW

    How can I implement multiple instances of the same wildcard in the first make rule?

    Read the article
