Search Results

Search found 32120 results on 1285 pages for 'django multi table inheri'.


  • what mysql table structure is better

    - by Sergey
    I have a very complicated search algorithm on my site, so I decided to make a table that caches some or maybe all possible results. I want to ask which structure would be better, or maybe neither of them (MySQL): 1) word VARCHAR, results TEXT or BLOB where I'll store the IDs of the found objects (for example, 6 chars per ID), or 2) word VARCHAR, result INT, but words are not unique. Right now I think I'll have about 200 000 rows in 1) with 1 000-10 000 IDs per row, or 200 000 000+ rows in 2). The first way takes more storage, but I think it would be much faster to find 1 unique row among 200 000 than 1 000 rows among 200 million non-unique rows. I'm thinking of an index on the word column and no Sphinx. So what do YOU think?
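
    A minimal sketch of the two candidate structures (table names, column sizes and engine are illustrative assumptions, not from the original post):

        -- Option 1: one row per word; matching object ids packed into one blob/text column
        CREATE TABLE search_cache_packed (
            word    VARCHAR(100) NOT NULL,
            results MEDIUMBLOB   NOT NULL,   -- e.g. fixed-width 6-char ids concatenated
            PRIMARY KEY (word)               -- unique lookup, one row per search term
        ) ENGINE=InnoDB;

        -- Option 2: one row per (word, object id) pair; words repeat
        CREATE TABLE search_cache_pairs (
            word   VARCHAR(100) NOT NULL,
            result INT          NOT NULL,
            KEY idx_word (word)              -- non-unique index, many rows per word
        ) ENGINE=InnoDB;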

    Read the article

  • Lua not producing table of functions (IO API)

    - by ArtOfCode
    I'm working on a basic project in Lua. I've been trying to get data from files using the IO API (as defined here), but when I open a file and assign it to a handle, it doesn't seem to return a table of functions. The (erroneous) code:

        local unread = fs.list("email/"..from.."/")
        local send = ""
        for _, file in ipairs(unread) do
            local handle = io.open(file, "r")
            local text = handle:read("*a")
            send = send .. text .. "\n"
            handle.close()   -- note: this should likely be handle:close()
            fs.delete(file)
        end

    The fs you see on the first line is a professional filesystem wrapper around the IO API, not my work and perfectly error-free, so that's not the problem. However, when I try to read the file (handle:read()), it throws "attempt to index nil". Tracing it, it turns out that handle itself is nil. Any ideas?
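
    A minimal diagnostic sketch, under the assumption that fs.list returns names relative to the listed directory (so the full path has to be rebuilt before io.open); io.open returns nil plus an error message on failure, which makes the cause visible:

        local dir = "email/"..from.."/"
        for _, file in ipairs(fs.list(dir)) do
            -- capture both return values instead of indexing a possibly-nil handle
            local handle, err = io.open(dir..file, "r")
            if not handle then
                error("could not open "..dir..file..": "..tostring(err))
            end
            local text = handle:read("*a")
            handle:close()
        end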

    Read the article

  • 100% height table resets scroll offset

    - by koko
    Hi, this is more of a question of principle. I made a table with 100% width and height to make 3 rows nice and auto-resizable (welcome to XHTML :D). When I begin to toggle() some elements, the total size of the page changes, and my browser resets its scroll offset and scrolls all the way to the top of the page. Is there some way to prevent this, other than writing a JS function that records the scroll offset and jumps back to it afterwards? I don't want to mess around with 3 divs, trying to align their heights automatically.
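
    If the JS fallback turns out to be acceptable, a minimal sketch (jQuery assumed, since toggle() is mentioned; the helper name is made up for illustration) would be:

        function toggleWithoutJump(selector) {
            // remember where the window is, toggle, then restore the offset
            var offset = $(window).scrollTop();
            $(selector).toggle();
            $(window).scrollTop(offset);
        }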

    Read the article

  • Select those objects whose related objects IDs are *all* in given string

    - by Jannis
    Hi Django people, I want to build a frontend to a recipe database which enables the user to search for a list of recipes that are cookable with the ingredients the user supplies. I have the following models:

        class Ingredient(models.Model):
            name = models.CharField(max_length=100, unique=True)
            slug = models.SlugField(max_length=100, unique=True)
            importancy = models.PositiveSmallIntegerField(default=4)
            […]

        class Amount(models.Model):
            recipe = models.ForeignKey('Recipe')
            ingredient = models.ForeignKey(Ingredient)
            […]

        class Rezept(models.Model):
            name = models.CharField(max_length=100)
            slug = models.SlugField()
            instructions = models.TextField()
            ingredients = models.ManyToManyField(Ingredient, through=Amount)
            […]

    and a raw query which does exactly what I want: it gets all the recipes whose required ingredients are all contained in the list of strings that the user supplies. If the user supplies more than necessary, that's fine too.

        query = "SELECT *, COUNT(amount.zutat_id) AS selected_count_ingredients, (SELECT COUNT(*) FROM amount WHERE amount.recipe_id = amount.id) AS count_ingredients FROM amount LEFT OUTER JOIN amount ON (recipe.id = recipe.recipe_id) WHERE amount.ingredient_id IN (%s) GROUP BY amount.id HAVING count_ingredient=selected_count_ingredient" % ",".join([str(ingredient.id) for ingredient in ingredients])
        rezepte = Rezept.objects.raw(query)

    Now, what I'm looking for is a way that does not rely on .raw(), as I would like to do it purely with Django's queryset methods. Additionally, it would be awesome if you knew a way of including the ingredient's importancy in the lookup, so that a recipe is still shown as a result even though one of its ingredients (one that has an importancy of 0) is not supplied by the user.
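
    A hedged sketch of a raw-free approach: instead of counting matches, exclude every recipe that needs an ingredient the user did not supply, and (for the importancy part) only treat ingredients with importancy above 0 as required. The variable names are assumptions for illustration:

        supplied_ids = [ingredient.id for ingredient in ingredients]

        # ingredients the user does NOT have, ignoring optional ones (importancy 0)
        missing = Ingredient.objects.exclude(id__in=supplied_ids).filter(importancy__gt=0)

        # a recipe is cookable if none of its required ingredients is missing
        cookable = Rezept.objects.exclude(ingredients__in=missing).distinct()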

    Read the article

  • Dynamically setting the queryset of a ModelMultipleChoiceField to a custom recordset

    - by Daniel Quinn
    I've seen all the howtos about how you can set a ModelMultipleChoiceField to use a custom queryset, and I've tried them and they work. However, they all use the same paradigm: the queryset is just a filtered list of the same objects. In my case, I'm trying to get the admin to draw a multiselect form where, instead of using usernames as the text portion of the <option>, I'd like to use the name field from my Account class. Here's a breakdown of what I've got:

        # models.py
        class Account(models.Model):
            name = models.CharField(max_length=128, help_text="A display name that people understand")
            user = models.ForeignKey(User, unique=True)  # Tied to the User class in settings.py

        class Organisation(models.Model):
            administrators = models.ManyToManyField(User)

        # admin.py
        from django.forms import ModelMultipleChoiceField
        from django.contrib.auth.models import User

        class OrganisationAdminForm(forms.ModelForm):
            def __init__(self, *args, **kwargs):
                from ethico.accounts.models import Account
                self.base_fields["administrators"] = ModelMultipleChoiceField(
                    queryset=User.objects.all(),
                    required=False
                )
                super(OrganisationAdminForm, self).__init__(*args, **kwargs)

            class Meta:
                model = Organisation

    This works; however, I want the queryset above to draw a select box with the Account.name property as the label and the User.id property as the value. This didn't work:

        queryset=Account.objects.all().order_by("name").values_list("user", "name")

    It failed with this error: 'tuple' object has no attribute 'pk'. I figured this would be easy, but it's turned into hours of dead ends. Anyone care to shed some light?
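
    A hedged sketch of the usual approach: keep the User queryset but override label_from_instance on a ModelMultipleChoiceField subclass. This assumes each User can be reached from its Account through the reverse accessor (account_set here; the real name depends on the models):

        class AccountNameChoiceField(ModelMultipleChoiceField):
            def label_from_instance(self, user):
                # fall back to the username if no Account row exists for this user
                accounts = user.account_set.all()[:1]
                return accounts[0].name if accounts else user.username

        # in OrganisationAdminForm.__init__:
        self.base_fields["administrators"] = AccountNameChoiceField(
            queryset=User.objects.all(),
            required=False
        )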

    Read the article

  • Blocking error in celery

    - by dmitry
    I have no idea what's this. Python 2.7 + django-1.5.1 + httpd + rabbitmq + django-celery==3.0.17 Tasks are not executed because of some error. Below is celery's log. Maybe someone has faced it before. [2013-06-24 17:10:03,792: CRITICAL/MainProcess] Can't decode message body: AttributeError("'JoinInfo' object has no attribute '__dict__'",) (type:u'application/x-python-serialize' encoding:u'binary' raw:'\'\\x80\\x02}q\\x01(U\\x07expiresq\\x02NU\\x03utcq\\x03\\x88U\\x04argsq\\x04cdjango.contrib.auth.models\\nUser\\nq\\x05)\\x81q\\x06}q\\x07(U\\x08usernameq\\x08X\\x19\\x00\\x00\\[email protected]\\nfirst_nameq\\tX\\x05\\x00\\x00\\x00BibbyU\\tlast_nameq\\nX\\x08\\x00\\x00\\x00OffshoreU\\r_client_cacheq\\x0bccopy_reg\\n_reconstructor\\nq\\x0ccbongoregistration.models\\nClient\\nq\\rc__builtin__\\nobject\\nq\\x0eN\\x87Rq\\x0f}q\\x10(h\\nX\\x08\\x00\\x00\\x00OffshoreU\\x1bpurchase_confirmation_emailq\\x11X\\x1f\\x00\\x00\\[email protected]\\x1dpurchase_confirmation_email_1q\\x12X!\\x00\\x00\\[email protected]\\x06_stateq\\x13cdjango.db.models.base\\nModelState\\nq\\x14)\\x81q\\x15}q\\x16(U\\x06addingq\\x17\\x89U\\x02dbq\\x18U\\x07defaultq\\x19ubU\\x0buser_ptr_idq\\x1aJ\\xb4\\xa2\\x03\\x00U\\x08is_staffq\\x1b\\x89U\\x08postcodeq\\x1cX\\x08\\x00\\x00\\x00AB11 5BSU\\x0cdegree_limitq\\x1dK\\x06U\\x07messageq\\x1eX\\xd1E\\x00\\x00<table id="container" style="margin: 0px; padding: 0px; width: 100%; background-color: #ffffff;" cellspacing="0" cellpadding="0"... (22911b)'') Traceback (most recent call last): File "/opt/www/MyProject-main/eggs/kombu-2.5.10-py2.7.egg/kombu/messaging.py", line 556, in _receive_callback decoded = None if on_m else message.decode() File "/opt/www/MyProject-main/eggs/kombu-2.5.10-py2.7.egg/kombu/transport/base.py", line 147, in decode self.content_encoding, accept=self.accept) File "/opt/www/MyProject-main/eggs/kombu-2.5.10-py2.7.egg/kombu/serialization.py", line 187, in decode return decode(data) File "/opt/www/MyProject-main/eggs/kombu-2.5.10-py2.7.egg/kombu/serialization.py", line 74, in pickle_loads return load(BytesIO(s)) AttributeError: 'JoinInfo' object has no attribute '__dict__'
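
    The traceback appears to come from kombu failing to unpickle a task payload that contains full Django objects (a User with a cached Client, plus a JoinInfo instance). A common mitigation, sketched here with assumed setting and task names and without knowing the project's code, is to pass primitive values to tasks and avoid pickle for the message body:

        # settings.py -- a hedged sketch; these are the standard Celery setting names,
        # but check them against the installed django-celery/Celery version
        CELERY_TASK_SERIALIZER = 'json'
        CELERY_RESULT_SERIALIZER = 'json'

        # tasks.py -- pass ids instead of model instances (names are illustrative)
        from celery.task import task
        from django.contrib.auth.models import User

        @task
        def send_join_email(user_id):
            user = User.objects.get(pk=user_id)
            # build the JoinInfo / message here, inside the worker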

    Read the article

  • Firefox handles xxx.submit(), Safari doesn't ... what can be done?

    - by Prairiedogg
    I'm trying to make a pull-down menu post a form when the user selects (releases the mouse on) one of the options from the menu. This code works fine in FF, but Safari, for some reason, doesn't submit the form. I re-wrote the code using jQuery to see if jQuery's .submit() implementation handled the browser quirks better. Same result: works in FF, doesn't work in Safari. The following snippets are from the same page, which has some Django template language mixed in. Here's the vanilla JS attempt:

        function formSubmit(lang) {
            if (lang != '{{ LANGUAGE_CODE }}') {
                document.getElementById("setlang_form").submit();
            }
        }

    Here's the jQuery attempt:

        $(document).ready(function() {
            $('#lang_submit').hide()
            $('#setlang_form option').mouseup(function () {
                if ($(this).attr('value') != '{{ LANGUAGE_CODE }}') {
                    $('#setlang_form').submit()
                }
            });
        });

    and here's the form:

        <form id="setlang_form" method="post" action="{% url django.views.i18n.set_language %}">
            <fieldset>
                <select name="language">
                    {% for lang in interface_languages %}
                    <option value="{{ lang.code }}" onmouseup="formSubmit('{{ lang.name }}')" {% ifequal lang.code LANGUAGE_CODE %}selected="selected"{% endifequal %}>{{ lang.name }}</option>
                    {% endfor %}
                </select>
            </fieldset>
        </form>

    My question is, how can I get this working in Safari?
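
    A plausible explanation is that Safari does not deliver mouse events to individual option elements. A hedged sketch of the usual workaround is to listen for change on the select instead:

        $(document).ready(function () {
            $('#lang_submit').hide();
            // change fires in Safari, Firefox and IE when a different option is chosen
            $('#setlang_form select[name="language"]').change(function () {
                if ($(this).val() != '{{ LANGUAGE_CODE }}') {
                    $('#setlang_form').submit();
                }
            });
        });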

    Read the article

  • JDeveloper 11.1.2 : Command Link in Table Column Work Around

    - by Frank Nimphius
    Just figured out that in Oracle JDeveloper 11.1.2, clicking on a command link in a table does not mark the table row as selected, as was the behavior in previous releases of Oracle JDeveloper. For the time being, the following workaround can be used to achieve the "old" behavior: to mark the table row as selected, you need to build and queue the table selection event in the code executed by the command link action listener. To queue a selection event, you need to know the rowKey of the row that the clicked command link is located in. To get to this information, you add an f:attribute tag to the command link as shown below.

        <af:column sortProperty="#{bindings.DepartmentsView1.hints.DepartmentId.name}" sortable="false"
                   headerText="#{bindings.DepartmentsView1.hints.DepartmentId.label}" id="c1">
          <af:commandLink text="#{row.DepartmentId}" id="cl1" partialSubmit="true"
                          actionListener="#{BrowseBean.onCommandItemSelected}">
            <f:attribute name="rowKey" value="#{row.rowKey}"/>
          </af:commandLink>
          ...
        </af:column>

    The f:attribute tag references #{row.rowKey}, which in ADF translates to JUCtrlHierNodeBinding.getRowKey(). This information can be used in the command link action listener to compose the RowKeySet you need to queue the selected row. For simplicity, I created a table "binding" reference in the managed bean that executes the command link action. The managed bean code that is referenced from the af:commandLink actionListener property is shown next:

        public void onCommandItemSelected(ActionEvent actionEvent) {
          //get access to the clicked command link
          RichCommandLink comp = (RichCommandLink)actionEvent.getComponent();
          //read the added f:attribute value
          Key rowKey = (Key) comp.getAttributes().get("rowKey");
          //get the current selected RowKeySet from the table
          RowKeySet oldSelection = table.getSelectedRowKeys();
          //build an empty RowKeySet for the new selection
          RowKeySetImpl newSelection = new RowKeySetImpl();
          //RowKeySets contain List objects with key objects in them
          ArrayList list = new ArrayList();
          list.add(rowKey);
          newSelection.add(list);
          //create the selectionEvent and queue it
          SelectionEvent selectionEvent = new SelectionEvent(oldSelection, newSelection, table);
          selectionEvent.queue();
          //refresh the table
          AdfFacesContext.getCurrentInstance().addPartialTarget(table);
        }

    Read the article

  • SQL Server script commands to check if object exists and drop it

    - by deadlydog
    Over the past couple years I’ve been keeping track of common SQL Server script commands that I use so I don’t have to constantly Google them. Most of them show how to check if a SQL object exists before dropping it. I thought others might find it useful to have them all in one place, so here you go:

        --===============================
        -- Create a new table and add keys and constraints
        --===============================
        IF NOT EXISTS (SELECT * FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = 'TableName' AND TABLE_SCHEMA='dbo')
        BEGIN
            CREATE TABLE [dbo].[TableName]
            (
                [ColumnName1] INT NOT NULL, -- To have a field auto-increment add IDENTITY(1,1)
                [ColumnName2] INT NULL,
                [ColumnName3] VARCHAR(30) NOT NULL DEFAULT('')
            )

            -- Add the table's primary key
            ALTER TABLE [dbo].[TableName] ADD CONSTRAINT [PK_TableName] PRIMARY KEY NONCLUSTERED
            (
                [ColumnName1],
                [ColumnName2]
            )

            -- Add a foreign key constraint
            ALTER TABLE [dbo].[TableName] WITH CHECK ADD CONSTRAINT [FK_Name] FOREIGN KEY
            (
                [ColumnName1],
                [ColumnName2]
            )
            REFERENCES [dbo].[Table2Name]
            (
                [OtherColumnName1],
                [OtherColumnName2]
            )

            -- Add indexes on columns that are often used for retrieval
            CREATE INDEX IN_ColumnNames ON [dbo].[TableName]
            (
                [ColumnName2],
                [ColumnName3]
            )

            -- Add a check constraint
            ALTER TABLE [dbo].[TableName] WITH CHECK ADD CONSTRAINT [CH_Name] CHECK (([ColumnName] >= 0.0000))
        END

        --===============================
        -- Add a new column to an existing table
        --===============================
        IF NOT EXISTS (SELECT * FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_SCHEMA='dbo'
            AND TABLE_NAME = 'TableName' AND COLUMN_NAME = 'ColumnName')
        BEGIN
            ALTER TABLE [dbo].[TableName] ADD [ColumnName] INT NOT NULL DEFAULT(0)

            -- Add a description extended property to the column to specify what its purpose is.
            EXEC sys.sp_addextendedproperty @name=N'MS_Description',
                @value = N'Add column comments here, describing what this column is for.',
                @level0type=N'SCHEMA', @level0name=N'dbo', @level1type=N'TABLE',
                @level1name = N'TableName', @level2type=N'COLUMN',
                @level2name = N'ColumnName'
        END

        --===============================
        -- Drop a table
        --===============================
        IF EXISTS (SELECT * FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = 'TableName' AND TABLE_SCHEMA='dbo')
        BEGIN
            DROP TABLE [dbo].[TableName]
        END

        --===============================
        -- Drop a view
        --===============================
        IF EXISTS (SELECT * FROM INFORMATION_SCHEMA.VIEWS WHERE TABLE_NAME = 'ViewName' AND TABLE_SCHEMA='dbo')
        BEGIN
            DROP VIEW [dbo].[ViewName]
        END

        --===============================
        -- Drop a column
        --===============================
        IF EXISTS (SELECT * FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_SCHEMA='dbo'
            AND TABLE_NAME = 'TableName' AND COLUMN_NAME = 'ColumnName')
        BEGIN
            -- If the column has an extended property, drop it first.
            IF EXISTS (SELECT * FROM sys.fn_listExtendedProperty(N'MS_Description', N'SCHEMA', N'dbo', N'TABLE',
                N'TableName', N'COLUMN', N'ColumnName'))
            BEGIN
                EXEC sys.sp_dropextendedproperty @name=N'MS_Description',
                    @level0type=N'SCHEMA', @level0name=N'dbo', @level1type=N'TABLE',
                    @level1name = N'TableName', @level2type=N'COLUMN',
                    @level2name = N'ColumnName'
            END

            ALTER TABLE [dbo].[TableName] DROP COLUMN [ColumnName]
        END

        --===============================
        -- Drop Primary key constraint
        --===============================
        IF EXISTS (SELECT * FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS WHERE CONSTRAINT_TYPE='PRIMARY KEY' AND TABLE_SCHEMA='dbo'
            AND TABLE_NAME = 'TableName' AND CONSTRAINT_NAME = 'PK_Name')
        BEGIN
            ALTER TABLE [dbo].[TableName] DROP CONSTRAINT [PK_Name]
        END

        --===============================
        -- Drop Foreign key constraint
        --===============================
        IF EXISTS (SELECT * FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS WHERE CONSTRAINT_TYPE='FOREIGN KEY' AND TABLE_SCHEMA='dbo'
            AND TABLE_NAME = 'TableName' AND CONSTRAINT_NAME = 'FK_Name')
        BEGIN
            ALTER TABLE [dbo].[TableName] DROP CONSTRAINT [FK_Name]
        END

        --===============================
        -- Drop Unique key constraint
        --===============================
        IF EXISTS (SELECT * FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS WHERE CONSTRAINT_TYPE='UNIQUE' AND TABLE_SCHEMA='dbo'
            AND TABLE_NAME = 'TableName' AND CONSTRAINT_NAME = 'UNI_Name')
        BEGIN
            ALTER TABLE [dbo].[TableName] DROP CONSTRAINT [UNI_Name]
        END

        --===============================
        -- Drop Check constraint
        --===============================
        IF EXISTS (SELECT * FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS WHERE CONSTRAINT_TYPE='CHECK' AND TABLE_SCHEMA='dbo'
            AND TABLE_NAME = 'TableName' AND CONSTRAINT_NAME = 'CH_Name')
        BEGIN
            ALTER TABLE [dbo].[TableName] DROP CONSTRAINT [CH_Name]
        END

        --===============================
        -- Drop a column's Default value constraint
        --===============================
        DECLARE @ConstraintName VARCHAR(100)
        SET @ConstraintName = (SELECT TOP 1 s.name FROM sys.sysobjects s JOIN sys.syscolumns c ON s.parent_obj=c.id
            WHERE s.xtype='d' AND c.cdefault=s.id
            AND parent_obj = OBJECT_ID('TableName') AND c.name ='ColumnName')

        IF @ConstraintName IS NOT NULL
        BEGIN
            EXEC ('ALTER TABLE [dbo].[TableName] DROP CONSTRAINT ' + @ConstraintName)
        END

        --===============================
        -- Example of how to drop dynamically named Unique constraint
        --===============================
        DECLARE @ConstraintName VARCHAR(100)
        SET @ConstraintName = (SELECT TOP 1 CONSTRAINT_NAME FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS
            WHERE CONSTRAINT_TYPE='UNIQUE' AND TABLE_SCHEMA='dbo'
            AND TABLE_NAME = 'TableName' AND CONSTRAINT_NAME LIKE 'FirstPartOfConstraintName%')

        IF EXISTS (SELECT * FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS WHERE CONSTRAINT_TYPE='UNIQUE' AND TABLE_SCHEMA='dbo'
            AND TABLE_NAME = 'TableName' AND CONSTRAINT_NAME = @ConstraintName)
        BEGIN
            EXEC ('ALTER TABLE [dbo].[TableName] DROP CONSTRAINT ' + @ConstraintName)
        END

        --===============================
        -- Check for and drop a temp table
        --===============================
        IF OBJECT_ID('tempdb..#TableName') IS NOT NULL DROP TABLE #TableName

        --===============================
        -- Drop a stored procedure
        --===============================
        IF EXISTS (SELECT * FROM INFORMATION_SCHEMA.ROUTINES WHERE ROUTINE_TYPE='PROCEDURE' AND ROUTINE_SCHEMA='dbo' AND
            ROUTINE_NAME = 'StoredProcedureName')
        BEGIN
            DROP PROCEDURE [dbo].[StoredProcedureName]
        END

        --===============================
        -- Drop a UDF
        --===============================
        IF EXISTS (SELECT * FROM INFORMATION_SCHEMA.ROUTINES WHERE ROUTINE_TYPE='FUNCTION' AND ROUTINE_SCHEMA='dbo' AND
            ROUTINE_NAME = 'UDFName')
        BEGIN
            DROP FUNCTION [dbo].[UDFName]
        END

        --===============================
        -- Drop an Index
        --===============================
        IF EXISTS (SELECT * FROM SYS.INDEXES WHERE name = 'IndexName')
        BEGIN
            DROP INDEX TableName.IndexName
        END

        --===============================
        -- Drop a Schema
        --===============================
        IF EXISTS (SELECT * FROM INFORMATION_SCHEMA.SCHEMATA WHERE SCHEMA_NAME = 'SchemaName')
        BEGIN
            EXEC('DROP SCHEMA SchemaName')
        END

    Read the article

  • LinqPad with Azure Table Storage

    - by Sarang
    LinqPad, as we all know, has been a wonderful tool for running ad-hoc queries. With Windows Azure Table storage in the picture, LinqPad was no longer an option, and we shifted focus to Cloud Storage Studio, only to realize the limited and strange querying capabilities of CSS. With some tweaking to LinqPad, we can get the comfortable old shoe of ad-hoc queries back for Windows Azure Table storage. Steps:

    1. Start LinqPad.
    2. Right-click in the query window and select "Query Properties".
    3. In the Additional References, add references to Microsoft.WindowsAzure.StorageClient, System.Data.Services.Client.dll and the assembly containing the implementation of the DataServiceContext class tied to the Windows Azure table storage.
    4. In the additional namespace imports, import the same three namespaces mentioned above.
    5. Then we need to provide the following details: the table storage account name and shared key, the DataServiceContext-implementing class in your code, and a LINQ query. For example:

        var storageAccountName = "myStorageAccount";  // Enter valid storage account name
        var storageSharedKey = "mysharedKey"; // Enter valid storage account shared key
        var uri = new System.Uri("http://table.core.windows.net/");
        var storageAccountInfo = new CloudStorageAccount(new StorageCredentialsAccountKey(storageAccountName, storageSharedKey), false);
        var serviceContext = new TweetPollDataServiceContext(storageAccountInfo); // Specify the DataServiceContext implementation

        // The query
        var query = from row in serviceContext.Table
                    select row;
        query.Dump();

    Thanks LinqPad!

    Read the article

  • BizTalk 2009 - The Scope of the Table Looping Functoid

    - by StuartBrierley
    When mapping in BizTalk you will find there are times when you need to map from flat and dispersed elements in your source schema to a repeated record with child elements in your destination schema.  Below is an example of how you can make use of the Table Looping Functoid to bring together these flat elements and create your repeated group.  Although this example is purposely simple, I have previously encountered this issue on a much more complex scale when mapping the response from a credit scoring agency, where all the applicant details were supplied in separate parts of a very flat schema. Consider the source and destination schemas as follows:   Although the Table Looping Functoid states that the first input must be a scoping element linked from a repeating group, you can actually also make use of a constant value.  In this case I know that the source schema always contains two people, so I set this to two. Then you need to set the number of columns in your table, in this case 2 (name and sex), and link all the required fields from the source schema. Following this you can configure the table. You can then add the Table Extractor functoids and complete the map. If you now validate this map you will see that BizTalk will warn you about the scoping link for the Table Looping Functoid, but this can be safely ignored. C:\Code\Developer Folders\Stuart Brierley\Test Mapping\TableLooping.btm: warning btm1071: A first input of the Table-Looping functoid must be a link from a Source Tree Node which acts as the scoping parameter. Testing the map will produce the following output:

    Read the article

  • EF4 generates invalid script

    - by Jaxidian
    When I right-click in a .EDMX file and click Generate Database From Model, the resulting script is obviously wrong because of the table names. What it generates is the following script. Note the table names in the DROP TABLE part versus the CREATE TABLE part. Why is this inconsistent? This is obviously not a reusable script. What I created was an Entity named "Address" and an Entity named "Company", etc (all singular). The EntitySet names are pluralized. The "Pluralize New Objects" boolean does not change this either. So what's the deal? For what it's worth, I originally generated the EDMX by pointing it to a database that had tables with non-pluralized names and now that I've made some changes, I want to go back the other way. I'd like to have the option to go back and forth as neither the db-first nor the model-first model is ideal in all scenarios, and I have the control to ensure that there will be no merging issues from multiple people going both ways at the same time. -- -------------------------------------------------- -- Dropping existing FOREIGN KEY constraints -- NOTE: if the constraint does not exist, an ignorable error will be reported. -- -------------------------------------------------- ALTER TABLE [Address] DROP CONSTRAINT [FK_Address_StateID-State_ID]; GO ALTER TABLE [Company] DROP CONSTRAINT [FK_Company_AddressID-Address_ID]; GO ALTER TABLE [Employee] DROP CONSTRAINT [FK_Employee_BossEmployeeID-Employee_ID]; GO ALTER TABLE [Employee] DROP CONSTRAINT [FK_Employee_CompanyID-Company_ID]; GO ALTER TABLE [Employee] DROP CONSTRAINT [FK_Employee_PersonID-Person_ID]; GO ALTER TABLE [Person] DROP CONSTRAINT [FK_Person_AddressID-Address_ID]; GO -- -------------------------------------------------- -- Dropping existing tables -- NOTE: if the table does not exist, an ignorable error will be reported. 
-- -------------------------------------------------- DROP TABLE [Address]; GO DROP TABLE [Company]; GO DROP TABLE [Employee]; GO DROP TABLE [Person]; GO DROP TABLE [State]; GO -- -------------------------------------------------- -- Creating all tables -- -------------------------------------------------- -- Creating table 'Addresses' CREATE TABLE [Addresses] ( [ID] int IDENTITY(1,1) NOT NULL, [StreetAddress] nvarchar(100) NOT NULL, [City] nvarchar(100) NOT NULL, [StateID] int NOT NULL, [Zip] nvarchar(10) NOT NULL ); GO -- Creating table 'Companies' CREATE TABLE [Companies] ( [ID] int IDENTITY(1,1) NOT NULL, [Name] nvarchar(100) NOT NULL, [AddressID] int NOT NULL ); GO -- Creating table 'People' CREATE TABLE [People] ( [ID] int IDENTITY(1,1) NOT NULL, [FirstName] nvarchar(100) NOT NULL, [LastName] nvarchar(100) NOT NULL, [AddressID] int NOT NULL ); GO -- Creating table 'States' CREATE TABLE [States] ( [ID] int IDENTITY(1,1) NOT NULL, [Name] nvarchar(100) NOT NULL, [Abbreviation] nvarchar(2) NOT NULL ); GO -- Creating table 'Employees' CREATE TABLE [Employees] ( [ID] int IDENTITY(1,1) NOT NULL, [PersonID] int NOT NULL, [CompanyID] int NOT NULL, [Position] nvarchar(100) NOT NULL, [BossEmployeeID] int NULL ); GO -- -------------------------------------------------- -- Creating all PRIMARY KEY constraints -- -------------------------------------------------- -- Creating primary key on [ID] in table 'Addresses' ALTER TABLE [Addresses] ADD CONSTRAINT [PK_Addresses] PRIMARY KEY ([ID] ); GO -- Creating primary key on [ID] in table 'Companies' ALTER TABLE [Companies] ADD CONSTRAINT [PK_Companies] PRIMARY KEY ([ID] ); GO -- Creating primary key on [ID] in table 'People' ALTER TABLE [People] ADD CONSTRAINT [PK_People] PRIMARY KEY ([ID] ); GO -- Creating primary key on [ID] in table 'States' ALTER TABLE [States] ADD CONSTRAINT [PK_States] PRIMARY KEY ([ID] ); GO -- Creating primary key on [ID] in table 'Employees' ALTER TABLE [Employees] ADD CONSTRAINT [PK_Employees] PRIMARY KEY ([ID] ); GO -- -------------------------------------------------- -- Creating all FOREIGN KEY constraints -- -------------------------------------------------- -- Creating foreign key on [StateID] in table 'Addresses' ALTER TABLE [Addresses] ADD CONSTRAINT [FK_Address_StateID_State_ID] FOREIGN KEY ([StateID]) REFERENCES [States] ([ID]) ON DELETE NO ACTION ON UPDATE NO ACTION; -- Creating non-clustered index for FOREIGN KEY 'FK_Address_StateID_State_ID' CREATE INDEX [IX_FK_Address_StateID_State_ID] ON [Addresses] ([StateID]); GO -- Creating foreign key on [AddressID] in table 'Companies' ALTER TABLE [Companies] ADD CONSTRAINT [FK_Company_AddressID_Address_ID] FOREIGN KEY ([AddressID]) REFERENCES [Addresses] ([ID]) ON DELETE NO ACTION ON UPDATE NO ACTION; -- Creating non-clustered index for FOREIGN KEY 'FK_Company_AddressID_Address_ID' CREATE INDEX [IX_FK_Company_AddressID_Address_ID] ON [Companies] ([AddressID]); GO -- Creating foreign key on [AddressID] in table 'People' ALTER TABLE [People] ADD CONSTRAINT [FK_Person_AddressID_Address_ID] FOREIGN KEY ([AddressID]) REFERENCES [Addresses] ([ID]) ON DELETE NO ACTION ON UPDATE NO ACTION; -- Creating non-clustered index for FOREIGN KEY 'FK_Person_AddressID_Address_ID' CREATE INDEX [IX_FK_Person_AddressID_Address_ID] ON [People] ([AddressID]); GO -- Creating foreign key on [CompanyID] in table 'Employees' ALTER TABLE [Employees] ADD CONSTRAINT [FK_Employee_CompanyID_Company_ID] FOREIGN KEY ([CompanyID]) REFERENCES [Companies] ([ID]) ON DELETE NO ACTION ON UPDATE NO ACTION; -- 
Creating non-clustered index for FOREIGN KEY 'FK_Employee_CompanyID_Company_ID' CREATE INDEX [IX_FK_Employee_CompanyID_Company_ID] ON [Employees] ([CompanyID]); GO -- Creating foreign key on [BossEmployeeID] in table 'Employees' ALTER TABLE [Employees] ADD CONSTRAINT [FK_Employee_BossEmployeeID_Employee_ID] FOREIGN KEY ([BossEmployeeID]) REFERENCES [Employees] ([ID]) ON DELETE NO ACTION ON UPDATE NO ACTION; -- Creating non-clustered index for FOREIGN KEY 'FK_Employee_BossEmployeeID_Employee_ID' CREATE INDEX [IX_FK_Employee_BossEmployeeID_Employee_ID] ON [Employees] ([BossEmployeeID]); GO -- Creating foreign key on [PersonID] in table 'Employees' ALTER TABLE [Employees] ADD CONSTRAINT [FK_Employee_PersonID_Person_ID] FOREIGN KEY ([PersonID]) REFERENCES [People] ([ID]) ON DELETE NO ACTION ON UPDATE NO ACTION; -- Creating non-clustered index for FOREIGN KEY 'FK_Employee_PersonID_Person_ID' CREATE INDEX [IX_FK_Employee_PersonID_Person_ID] ON [Employees] ([PersonID]); GO -- -------------------------------------------------- -- Script has ended -- --------------------------------------------------

    Read the article

  • Pull Request Changes, Multi-Selection in Advanced View, and Advertisement Changes

    [Do you tweet? Follow us on Twitter @matthawley and @adacole_msft] We deployed a new version of the CodePlex website today. Pull Request Changes In this release, we have begun to re-focus on Pull Requests to ensure a productive experience between the project users and developers. We feel we made significant progress in this area for this release and look forward to using your feedback to drive future iterations. One of the biggest hurdles people have indicated is the inability to see what a pull request includes without pulling the source down from a Mercurial client. With today’s changes, any user has the ability to view a pull request, the changesets / changes included, and perform an inline diff of the file. When a pull request is made, the CodePlex website will query for all outgoing changes from the fork to the main repository for a point-in-time comparison. Because of this point-in-time comparison… All existing pull requests created prior to this release will not have changesets associated with them. If new commits are pushed to the fork while a pull request is active, they will not appear associated with the pull request. The pull request will need to be re-submitted for them to appear. Once a pull request is created, you can “View the Pull Request” which takes you to a page that looks like As you may notice, we now display a lot more detailed information regarding that pull request including who it was requested by and when, the associated changesets, the description, who it’s assigned to (we’ll come back to this) and the listing of summarized file changes. What you’ll also notice, is that each modified file has the ability to view a diff of all changes made. When you click “(view diff)” for a file, an inline diff experience appears. This new experience allows you to quickly navigate through all of the modified files as well as viewing the various change blocks for each file. You’ll also notice as you browse through each file’s changes, we update the URL to include the file path so you can quickly send a direct link to a pull request’s file. Clicking “(close diff)” will bring you back to the original pull request view. View this pull request live on WikiPlex. Pull Request Review Assignment Another new feature we added for pull requests is the ability for project members to assign pull requests for review. Any project member has the ability to assign (and re-assign if needed) a pull request to a project member. Once the assignment has been made, that project member will be notified via email of the assignment. Once they complete the review of the pull request, they can either accept or deny it similarly to the previous process. Multi-Selection in Advanced View Filters One of the more recent requests we have heard from users is the ability multi-select advanced view filters for work items. We are happy to announce this is now possible. Simply control-click the multiple options for each filter item and your work item query will be refined as such. Should you happen to unselect all options for a given filter, it will automatically reset to the default option for that filter. Furthermore, the “Direct Link” URL will be updated to include the multi-selected options for each filter. Note: The “Direct Link” feature was released in our previous deployment, just never written about. It allows you to capture the current state of your query and send it to other individuals. 
Advertisement Changes Very recently, the advertiser (The Lounge) that we partnered with to provide advertising revenue for projects (or to donate it to charity) was acquired by Lake Quincy Media. There has been no change in the advertising platform offering, and all projects have been converted over to the new infrastructure. Project owners should note the new contact information for getting paid. The CodePlex team values your feedback, and is frequently monitoring Twitter, our Discussions and Issue Tracker for new features or problems. If you’ve not visited the Issue Tracker recently, please take a few moments to log an idea or vote for the features you would most like to see implemented on CodePlex.

    Read the article

  • correct way to create a pivot table in postgresql using CASE WHEN

    - by mojones
    I am trying to create a pivot table type view in postgresql and am nearly there! Here is the basic query: select acc2tax_node.acc, tax_node.name, tax_node.rank from tax_node, acc2tax_node where tax_node.taxid=acc2tax_node.taxid and acc2tax_node.acc='AJ012531'; And the data: acc | name | rank ----------+-------------------------+-------------- AJ012531 | Paromalostomum fusculum | species AJ012531 | Paromalostomum | genus AJ012531 | Macrostomidae | family AJ012531 | Macrostomida | order AJ012531 | Macrostomorpha | no rank AJ012531 | Turbellaria | class AJ012531 | Platyhelminthes | phylum AJ012531 | Acoelomata | no rank AJ012531 | Bilateria | no rank AJ012531 | Eumetazoa | no rank AJ012531 | Metazoa | kingdom AJ012531 | Fungi/Metazoa group | no rank AJ012531 | Eukaryota | superkingdom AJ012531 | cellular organisms | no rank What I am trying to get is the following: acc | species | phylum AJ012531 | Paromalostomum fusculum | Platyhelminthes I am trying to do this with CASE WHEN, so I've got as far as the following: select acc2tax_node.acc, CASE tax_node.rank WHEN 'species' THEN tax_node.name ELSE NULL END as species, CASE tax_node.rank WHEN 'phylum' THEN tax_node.name ELSE NULL END as phylum from tax_node, acc2tax_node where tax_node.taxid=acc2tax_node.taxid and acc2tax_node.acc='AJ012531'; Which gives me the output: acc | species | phylum ----------+-------------------------+----------------- AJ012531 | Paromalostomum fusculum | AJ012531 | | AJ012531 | | AJ012531 | | AJ012531 | | AJ012531 | | AJ012531 | | Platyhelminthes AJ012531 | | AJ012531 | | AJ012531 | | AJ012531 | | AJ012531 | | AJ012531 | | AJ012531 | | Now I know that I have to group by acc at some point, so I try select acc2tax_node.acc, CASE tax_node.rank WHEN 'species' THEN tax_node.name ELSE NULL END as sp, CASE tax_node.rank WHEN 'phylum' THEN tax_node.name ELSE NULL END as ph from tax_node, acc2tax_node where tax_node.taxid=acc2tax_node.taxid and acc2tax_node.acc='AJ012531' group by acc2tax_node.acc; But I get the dreaded ERROR: column "tax_node.rank" must appear in the GROUP BY clause or be used in an aggregate function All the previous examples I've been able to find use something like SUM() around the CASE statements, so I guess that is the aggregate function. I have tried using FIRST(): select acc2tax_node.acc, FIRST(CASE tax_node.rank WHEN 'species' THEN tax_node.name ELSE NULL END) as sp, FIRST(CASE tax_node.rank WHEN 'phylum' THEN tax_node.name ELSE NULL END) as ph from tax_node, acc2tax_node where tax_node.taxid=acc2tax_node.taxid and acc2tax_node.acc='AJ012531' group by acc2tax_node.acc; but get the error: ERROR: function first(character varying) does not exist Can anyone offer any hints?
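
    A hedged sketch of the standard fix: wrap each CASE in an aggregate (MAX works on text in PostgreSQL), so only acc needs to appear in GROUP BY:

        SELECT acc2tax_node.acc,
               MAX(CASE tax_node.rank WHEN 'species' THEN tax_node.name END) AS species,
               MAX(CASE tax_node.rank WHEN 'phylum'  THEN tax_node.name END) AS phylum
        FROM tax_node
        JOIN acc2tax_node ON tax_node.taxid = acc2tax_node.taxid
        WHERE acc2tax_node.acc = 'AJ012531'
        GROUP BY acc2tax_node.acc;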

    Read the article

  • joining table of oracle

    - by Deven
    Hi friends, I am having a problem joining two tables in Oracle. My two tables are shown below.

        table1:
        id      Name      Jan
        7001    Deven     22
        7002    Clause    55
        7004    Monish    11

        table2:
        id      Name      Feb
        7001    Deven     12
        7002    Clause    15
        7003    Nimesh    20
        7004    Monish    21
        7005    Ritesh    22

    I want to combine these two tables and get a result like the one below:

        id      Name      Jan    Feb
        7001    Deven     22     12
        7002    Clause    55     15
        7003    Nimesh    -      20
        7004    Monish    11     21
        7005    Ritesh    -      22
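
    A minimal sketch of one way to do this in Oracle, assuming the tables are literally named table1 and table2: a FULL OUTER JOIN keeps ids that appear in only one table, and NVL supplies the '-' placeholder:

        SELECT NVL(t1.id, t2.id)          AS id,
               NVL(t1.name, t2.name)      AS name,
               NVL(TO_CHAR(t1.jan), '-')  AS jan,
               NVL(TO_CHAR(t2.feb), '-')  AS feb
        FROM table1 t1
        FULL OUTER JOIN table2 t2 ON t1.id = t2.id
        ORDER BY 1;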

    Read the article

  • Using set with values from a table

    - by gozzilli
    I'm writing a database of all the DVDs I have at home. I would like one of the fields, actors, to be a set of values from another table, which stores actors. So for every film I want to store a list of actors, all of them selected from a list of actors kept in a different table. Is this possible? How do I do it? It would basically be a set of foreign keys. I'm using a MySQL database for a Django application (Python), so any hint in SQL or Python would be much appreciated. I hope the question is clear, many thanks.
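
    In Django this is exactly what a ManyToManyField expresses; it creates the underlying join table of foreign keys for you. A minimal sketch with assumed model names:

        from django.db import models

        class Actor(models.Model):
            name = models.CharField(max_length=200)

        class Dvd(models.Model):
            title = models.CharField(max_length=200)
            actors = models.ManyToManyField(Actor, related_name='dvds')

        # usage: dvd.actors.add(actor); actor.dvds.all()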

    Read the article

  • Table Row Spacing Problem in IE

    - by Brij
    When viewed in IE, the code below shows spacing between the rows; I want the rows to join. In Firefox it works fine.

        <table border="0" cellspacing='0' cellpadding='0' width="720" cols="2">
          <tr>
            <td colspan="2">
              <a href="index.html">
                <img src="images/banner.gif" border="0">
              </a>
            </td>
          </tr>
          <tr valign="top">
            <td width="130">
              <img name="navigate" src="images/navbar.jpg" border="0">
            </td>
        .....

    I have also tried style="margin:0; padding:0;" on tr and td, but it has no effect in IE. Let me know how to remove the spacing between the rows. Thanks
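
    A hedged sketch of the usual fixes for image-only table cells in IE: collapse the table borders and make the images block-level so no inline whitespace or baseline gap is rendered beneath them:

        table  { border-collapse: collapse; }
        td     { margin: 0; padding: 0; font-size: 0; line-height: 0; }
        td img { display: block; }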

    Read the article

  • Using a permutation table for simplex noise without storing it

    - by J. C. Leitão
    Generating Simplex noise requires a permutation table for randomisation (e.g. see this question or this example). In some applications, we need to persist the state of the permutation table. This can be done by creating the table, e.g. using

        def permutation_table(seed):
            table_size = 2**10  # arbitrary for this question
            l = range(1, table_size + 1)
            random.seed(seed)  # ensures the same shuffle for a given seed
            random.shuffle(l)
            return l + l  # see shared link for why l + l; it is a detail

    and storing it. Can we avoid storing the full table by generating the required elements every time they are required? Specifically, I currently store the table and index it with table[i] (table is a list). Can I avoid storing it by having a function that computes element i, e.g. get_table_element(seed, i)? I'm aware that cryptography already solved this problem using block ciphers; however, I found it too complex to go deep and implement a block cipher. Does anyone know a simple implementation of a block cipher for this problem?
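
    A minimal table-free sketch, under the assumption that statistical quality well below a real shuffle (or a proper block cipher / Feistel construction) is acceptable: an affine map with an odd multiplier is a true permutation of a power-of-two range, and both constants can be derived from the seed:

        import random

        def get_table_element(seed, i, table_size=2**10):
            rng = random.Random(seed)            # same seed -> same constants
            a = rng.randrange(1, table_size, 2)  # odd, hence coprime with 2**10
            b = rng.randrange(table_size)
            # maps 0..table_size-1 onto itself bijectively; +1 matches range(1, table_size + 1),
            # and i % table_size reproduces the doubled l + l indexing
            return (a * (i % table_size) + b) % table_size + 1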

    Read the article

  • Populate table with JQuery

    - by Talkar
    I need to populate a table with some data. The data is what I get back in response from my ASP.NET MVC site when I make a JSON post call to it. Yet I can't seem to find a way to actually display the data I get back. Here is my code so far; any help would be much appreciated:

        $(document).ready(function () {
            var uName = '<%= Session["UserName"].ToString()%>';
            var pWord = '<%= Session["Password"].ToString()%>';
            var data = { UserName: uName, Password: pWord };
            $.ajax({
                type: 'POST',
                url: "http://someurl.goes.here/",
                crossDomain: true,
                data: data,
                dataType: 'jsonp',
                success: function (myData) {
                    $.each(myData, function (index, element) {
                        $("#ClassTable").append('<tr><td> ' + element[0] + ' </td> <td> ' + element[1] + '</td></tr>');
                    })
                }
            });
        });

    myData looks like this:

        [Object { IsActive = True, ObjectId=1, ObjectString="someString", etc... etc... } ]
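
    One thing that stands out: the response items are objects with named properties, not arrays, so element[0] and element[1] will be undefined. A hedged sketch of the success handler using the property names shown in myData (adjust to whichever fields the table should display):

        success: function (myData) {
            $.each(myData, function (index, element) {
                $("#ClassTable").append(
                    '<tr><td>' + element.ObjectId + '</td><td>' + element.ObjectString + '</td></tr>');
            });
        }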

    Read the article

  • Table overflow working in Chrome and IE but not Firefox

    - by Craig
    I am trying to get a layout that always takes up the entire screen, no more, no less. The layout has a header row, a 200px wide left bar (scrollable), and a scrollable content area. This works in Chrome and IE, but in Firefox the scroll bars never show nor work. Any thoughts? <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"> <html> <head> <style type="text/css"> * { margin: 0; padding: 0; } html, body { height: 100%; background-color: yellow; overflow: hidden; } #viewTable { width: 100%; height: 100%; background-color: red; } #header { height: 72px; background-color: blue; } #leftcol { vertical-align: top; width: 200px; height: 100%; background-color: green; } #menu { height: 100%; overflow: auto; } #rightcol { vertical-align: top; width: auto; height: 100%; background-color: purple; } #content { height: 100%; overflow: auto; } </style> </head> <body> </body> <table id="viewTable" border="0" cellpadding="0" cellspacing="0"> <tr> <td colspan="2" id="header"> Header </td> </tr> <tr> <td id="leftcol"> <div id="menu"> 0<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 100<br/> </div> </td> <td id="rightcol"> <div id="content"> 0<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 1<br/> 100<br/> </div> </td> </tr> </table> hi </html> I would have preferred to use CSS, but could not find any way to do it. The hi should no show, it is simply there to verify it does not. Thank you!

    Read the article

  • jQuery Reference First Column in HTML Table

    - by Vic
    I have a table where all of the cells are INPUT tags. I have a function which looks for the first input cell and replaces it with its value. So this:

        <tr id="row_0" class="datarow">
            <td><input class="tabcell" value="Injuries"></td>
            <td><input class="tabcell" value="01"></td>

    becomes this:

        <tr id="row_0" class="datarow">
            <td>Injuries</td>
            <td><input class="tabcell" value="01"></td>

    Here is the first part of the function:

        function setRowLabels() {
            var row = [];
            $('.dataRow').each(function(i) {
                row.push($('td input:eq(0)', this).val() + ' - ');
                $('td input:eq(0)', this).replaceWith($('td input:eq(0)', this).val());
                $('td input:gt(0)', this).each(function(e) {
                    // etcetera

    But when the page reloads, the first column is not an input type, so it changes the second column to text too! Can I tell it to only change the first column, no matter what the type is? I tried $('td:eq(0)', this).replaceWith($('td:eq(0)', this).val()); but it does not work. Any suggestions appreciated! Thanks
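
    A hedged sketch of one way to make this independent of the cell's current content: scope to the first td and only swap in the value when that cell still holds an input (variable names carried over from the question):

        $('.dataRow').each(function () {
            var firstCell = $('td:eq(0)', this);
            var firstInput = firstCell.find('input');
            if (firstInput.length) {
                row.push(firstInput.val() + ' - ');
                firstCell.text(firstInput.val());   // the td stays; only its content changes
            } else {
                row.push(firstCell.text() + ' - '); // already plain text after a reload
            }
        });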

    Read the article

  • Is this a bad indexing strategy for a table?

    - by llamaoo7
    The table in question is part of a database that a vendor's software uses on our network. The table contains metadata about files. The schema of the table is as follows Metadata ResultID (PK, int, not null) MappedFieldname (char(50), not null) Fieldname (PK, char(50), not null) Fieldvalue (text, null) There is a clustered index on ResultID and Fieldname. This table typically contains millions of rows (in one case, it contains 500 million). The table is populated by 24 workers running 4 threads each when data is being "processed". This results in many non-sequential inserts. Later after processing, more data is inserted into this table by some of our in-house software. The fragmentation for a given table is at least 50%. In the case of the largest table, it is at 90%. We do not have a DBA. I am aware we desperately need a DB maintenance strategy. As far as my background, I'm a college student working part time at this company. My question is this, is a clustered index the best way to go about this? Should another index be considered? Are there any good references for this type and similar ad-hoc DBA tasks?

    Read the article

  • The [2] table entry '[3]' has no associated entry in the Media table. (error 2602)

    - by derekf
    Coworker started getting the above message in the event log and as dialog during install.  Argument [2] was File and argument [3] was a specific file. Error dialog read   Product: (app name) -- The installer has encountered an unexpected error installing this package. This may indicate a problem with this package. The error code is 2602. Package was a vendor-provided MSI that had been installed administratively, and then a patch (.msp) applied to the administrative install point. With some digging we found that the MSI still had the entries in the media table pointing at the CAB files, and that there were several files at the end of the sequence that did not have corresponding entries in the Media table (last sequence 990 in Media table, last entry in File table had sequence 994).  Attributes on files in the File table all had the msidbFileAttributesCompressed (&16384) attribute set, so they were all expecting to be within the CAB files, but since this was an admin install there were no CAB files. Resolved by clearing the Media table (replace with a single entry: Disk ID 1, LastSequence 994) and going through the file table and subtracting 8192 from each entry to mark files as not compressed.  Tested and worked.

    Read the article

  • validate uniqueness amongst multiple subclasses with Single Table Inheritance

    - by irkenInvader
    I have a Card model that has many Sets and a Set model that has many Cards through a Membership model: class Card < ActiveRecord::Base has_many :memberships has_many :sets, :through => :memberships end class Membership < ActiveRecord::Base belongs_to :card belongs_to :set validates_uniqueness_of :card_id, :scope => :set_id end class Set < ActiveRecord::Base has_many :memberships has_many :cards, :through => :memberships validates_presence_of :cards end I also have some sub-classes of the above using Single Table Inheritance: class FooCard < Card end class BarCard < Card end and class Expansion < Set end class GameSet < Set validates_size_of :cards, :is => 10 end All of the above is working as I intend. What I'm trying to figure out is how to validate that a Card can only belong to a single Expansion. I want the following to be invalid: some_cards = FooCard.all( :limit => 25 ) first_expansion = Expansion.new second_expansion = Expansion.new first_expansion.cards = some_cards second_expansion.cards = some_cards first_expansion.save # Valid second_expansion.save # **Should be invalid** However, GameSets should allow this behavior: other_cards = FooCard.all( :limit => 10 ) first_set = GameSet.new second_set = GameSet.new first_set.cards = other_cards # Valid second_set.cards = other_cards # Also valid I'm guessing that a validates_uniqueness_of call is needed somewhere, but I'm not sure where to put it. Any suggestions? UPDATE 1 I modified the Expansion class as sugested: class Expansion < Set validate :validates_uniqueness_of_cards def validates_uniqueness_of_cards membership = Membership.find( :first, :include => :set, :conditions => [ "card_id IN (?) AND sets.type = ?", self.cards.map(&:id), "Expansion" ] ) errors.add_to_base("a Card can only belong to a single Expansion") unless membership.nil? end end This works when creating initial expansions to validate that no current expansions contain the cards. However, this (falsely) invalidates future updates to the expansion with new cards. In other words: old_exp = Expansion.find(1) old_exp.card_ids # returns [1,2,3,4,5] new_exp = Expansion.new new_exp.card_ids = [6,7,8,9,10] new_exp.save # returns true new_exp.card_ids << [11,12] # no other Expansion contains these cards new_exp.valid? # returns false ... SHOULD be true
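
    One hedged way to keep the update case valid is to ignore memberships that belong to the expansion currently being validated; a sketch against the Rails 2-style finder used in the question (untested, and it assumes the default memberships/sets table names):

        def validates_uniqueness_of_cards
          conditions = ["card_id IN (?) AND sets.type = ?", cards.map(&:id), "Expansion"]
          unless new_record?
            conditions[0] += " AND memberships.set_id <> ?"   # skip this expansion's own rows
            conditions << id
          end
          membership = Membership.find(:first, :include => :set, :conditions => conditions)
          errors.add_to_base("a Card can only belong to a single Expansion") unless membership.nil?
        end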

    Read the article

  • Convert VARCHAR() columns to NVARCHAR()

    - by ChrisD
    We recently underwent an upgrade that required us to change our database columns from varchar to NVarchar, to support unicode characters. Digging through the internet, I found a base script which I modified to handle reserved word table names, and maintain the NULL/NotNull constraint of the columns.   I Ran this script use NWOperationalContent – Your Catalog Name here GO SELECT 'ALTER TABLE ' + isnull(schema_name(syo.id), 'dbo') + '.[' +  syo.name +'] '     + ' ALTER COLUMN [' + syc.name + '] NVARCHAR(' + case syc.length when -1 then 'MAX'         ELSE convert(nvarchar(10),syc.length) end + ') '+         case  syc.isnullable when 1 then ' NULL' ELSE ' NOT NULL' END +';'    FROM sysobjects syo    JOIN syscolumns syc ON      syc.id = syo.id    JOIN systypes syt ON      syt.xtype = syc.xtype    WHERE      syt.name = 'varchar'     and syo.xtype='U'   which produced a series of ALTER statements which I could then execute the tables.  In some cases I had to drop indexes, alter the tables, and re-create the indexes.  There might have been a better way to do that, but manually dropping them got the job done.   use NWMerchandisingContent GO ALTER TABLE Locale Drop Constraint PK_Locale ALTER TABLE Country DROP CONSTRAINT PK_Country GO ALTER TABLE dbo.[Campaign]  ALTER COLUMN [ActorKey] NVARCHAR(200)  NOT NULL; ALTER TABLE dbo.[BundleLocalization]  ALTER COLUMN [Locale] NVARCHAR(8)  NOT NULL; ALTER TABLE dbo.[BundleLocalization]  ALTER COLUMN [UnitOfmeasure] NVARCHAR(200)  NULL; ALTER TABLE dbo.[BundleLocalization]  ALTER COLUMN [ActorKey] NVARCHAR(200)  NOT NULL; ALTER TABLE dbo.[BundleComponentLocalization]  ALTER COLUMN [Locale] NVARCHAR(8)  NOT NULL; ALTER TABLE dbo.[BundleComponentLocalization]  ALTER COLUMN [Imperative] NVARCHAR(MAX)  NULL; ALTER TABLE dbo.[BundleComponentLocalization]  ALTER COLUMN [Instructions] NVARCHAR(MAX)  NULL; ALTER TABLE dbo.[BundleComponentLocalization]  ALTER COLUMN [ActorKey] NVARCHAR(200)  NOT NULL; ALTER TABLE dbo.[BundleComponent]  ALTER COLUMN [ActorKey] NVARCHAR(200)  NOT NULL; ALTER TABLE dbo.[Bundle]  ALTER COLUMN [ActorKey] NVARCHAR(200)  NOT NULL; ALTER TABLE dbo.[Banner]  ALTER COLUMN [ActorKey] NVARCHAR(200)  NOT NULL; ALTER TABLE dbo.[Video]  ALTER COLUMN [Link] NVARCHAR(512)  NOT NULL; ALTER TABLE dbo.[Video]  ALTER COLUMN [ActorKey] NVARCHAR(200)  NOT NULL; ALTER TABLE dbo.[ProductUsage]  ALTER COLUMN [VideoLink] NVARCHAR(512)  NOT NULL; ALTER TABLE dbo.[ProductUsage]  ALTER COLUMN [ActorKey] NVARCHAR(200)  NOT NULL; ALTER TABLE dbo.[Thumbnail]  ALTER COLUMN [ActorKey] NVARCHAR(200)  NOT NULL; ALTER TABLE dbo.[SkuLocalization]  ALTER COLUMN [Locale] NVARCHAR(8)  NOT NULL; ALTER TABLE dbo.[SkuLocalization]  ALTER COLUMN [UnitOfMeasure] NVARCHAR(150)  NOT NULL; ALTER TABLE dbo.[SkuLocalization]  ALTER COLUMN [SwatchColor] NVARCHAR(50)  NOT NULL; etc.. GO ALTER TABLE Locale ADD CONSTRAINT PK_Locale PRIMARY KEY (LocaleId) ALTER TABLE Country ADD CONSTRAINT PK_Country PRIMARY KEY (CountryId) Note that this alter is non-destructive to the data.   Hope this helps.

    Read the article
