Search Results

Search found 63114 results on 2525 pages for 'set identity insert'.


  • OS X - Automatically Set Execute Permissions for New Files?

    - by i help X u
    I'm using OS X 10.6.4 and am trying to set a folder to automatically enable execute permissions on new script files copied or created in that directory. I have used Sandbox 2 to set every permission for the folder to enabled, with the sticky bits and the inherit flag set, but I still have to manually set the execute flag using chmod for every new file. I've done: chmod -R a+rwxs ~/scripts I've done: chmod 7777 ~/scripts And the permissions for the folder show as drwsrwsrwt+. But if I add a new script file it's set to "-rw-r--r--+" (the default). I looked at setting "umask 000" in the .profile file, but the default mode for new files is 666 with a umask of 022, so that's not relevant since I would need a default of 777 for files. I have figured out how to use chmod in an AppleScript triggered by a folder action to automate this, but I'm wondering if there is a simple ACL or chmod setting I'm missing. So, is there a way to automatically set execute permission for new files? (Without using a folder action and AppleScript?)

    Read the article

  • Bulk inserting: best way to go about it? + Help me fully understand what I have found so far

    - by chobo2
    Hi, so I saw the post here and read it, and it seems like bulk copy might be the way to go: http://stackoverflow.com/questions/682015/whats-the-best-way-to-bulk-database-inserts-from-c I still have some questions and want to know how things actually work. So I found 2 tutorials: http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx The first way uses 2 ado.net 2.0 features, BulkInsert and BulkCopy. The second one uses linq to sql and OpenXML. This sort of appeals to me as I am using linq to sql already and prefer it over ado.net. However, as one person pointed out in the posts, it just works around the issue at the cost of performance (nothing wrong with that in my opinion).

    First I will talk about the 2 ways in the first tutorial. I am using VS2010 Express, .net 4.0, MVC 2.0 and SQL Server 2005. Is ado.net 2.0 the most current version? Based on the technology I am using, are there updates to what I am going to show that would improve it somehow? Is there anything that these tutorials left out that I should know about?

    BulkInsert. I am using this table for all the examples. CREATE TABLE [dbo].[TBL_TEST_TEST] ( ID INT IDENTITY(1,1) PRIMARY KEY, [NAME] [varchar](50) )

    SP Code: USE [Test] GO /****** Object: StoredProcedure [dbo].[sp_BatchInsert] Script Date: 05/19/2010 15:12:47 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO ALTER PROCEDURE [dbo].[sp_BatchInsert] (@Name VARCHAR(50) ) AS BEGIN INSERT INTO TBL_TEST_TEST VALUES (@Name); END

    C# Code: /// <summary> /// Another ado.net 2.0 way that uses a stored procedure to do a bulk insert. /// Seems slower than the "BatchBulkCopy" way and it crashes when you try to insert 500,000 records in one go. /// http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 /// </summary> private static void BatchInsert() { // Get the DataTable with Rows State as RowState.Added DataTable dtInsertRows = GetDataTable(); SqlConnection connection = new SqlConnection(connectionString); SqlCommand command = new SqlCommand("sp_BatchInsert", connection); command.CommandType = CommandType.StoredProcedure; command.UpdatedRowSource = UpdateRowSource.None; // Set the Parameter with appropriate Source Column Name command.Parameters.Add("@Name", SqlDbType.VarChar, 50, dtInsertRows.Columns[0].ColumnName); SqlDataAdapter adpt = new SqlDataAdapter(); adpt.InsertCommand = command; // Specify the number of records to be Inserted/Updated in one go. Default is 1. adpt.UpdateBatchSize = 1000; connection.Open(); int recordsInserted = adpt.Update(dtInsertRows); connection.Close(); }

    So the first thing is the batch size. Why would you set a batch size to anything but the number of records you are sending? I am sending 500,000 records, so I did a batch size of 500,000. Next, why does it crash when I do this? If I set the batch size to 1000 it works just fine. System.Data.SqlClient.SqlException was unhandled Message="A transport-level error has occurred when sending the request to the server. (provider: Shared Memory Provider, error: 0 - No process is on the other end of the pipe.)" Source=".Net SqlClient Data Provider" ErrorCode=-2146232060 Class=20 LineNumber=0 Number=233 Server="" State=0 StackTrace: at System.Data.Common.DbDataAdapter.UpdatedRowStatusErrors(RowUpdatedEventArgs rowUpdatedEvent, BatchCommandInfo[] batchCommands, Int32 commandCount) at System.Data.Common.DbDataAdapter.UpdatedRowStatus(RowUpdatedEventArgs rowUpdatedEvent, BatchCommandInfo[] batchCommands, Int32 commandCount) at System.Data.Common.DbDataAdapter.Update(DataRow[] dataRows, DataTableMapping tableMapping) at System.Data.Common.DbDataAdapter.UpdateFromDataTable(DataTable dataTable, DataTableMapping tableMapping) at System.Data.Common.DbDataAdapter.Update(DataTable dataTable) at TestIQueryable.Program.BatchInsert() in C:\Users\a\Downloads\TestIQueryable\TestIQueryable\TestIQueryable\Program.cs:line 124 at TestIQueryable.Program.Main(String[] args) in C:\Users\a\Downloads\TestIQueryable\TestIQueryable\TestIQueryable\Program.cs:line 16 InnerException:

    The time it took to insert 500,000 records with an insert batch size of 1000 was "2 mins and 54 seconds". Of course this is no official time; I sat there with a stopwatch (I am sure there are better ways, but I was too lazy to look up what they were). So I find that kinda slow compared to all my other ones (except the linq to sql insert one), and I am not really sure why.

    Next I looked at bulkcopy. /// <summary> /// An ado.net 2.0 way to mass insert records. This seems to be the fastest. /// http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 /// </summary> private static void BatchBulkCopy() { // Get the DataTable DataTable dtInsertRows = GetDataTable(); using (SqlBulkCopy sbc = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.KeepIdentity)) { sbc.DestinationTableName = "TBL_TEST_TEST"; // Number of records to be processed in one go sbc.BatchSize = 500000; // Map the Source Column from DataTable to the Destination Columns in SQL Server 2005 Person Table // sbc.ColumnMappings.Add("ID", "ID"); sbc.ColumnMappings.Add("NAME", "NAME"); // Number of records after which client has to be notified about its status sbc.NotifyAfter = dtInsertRows.Rows.Count; // Event that gets fired when NotifyAfter number of records are processed. sbc.SqlRowsCopied += new SqlRowsCopiedEventHandler(sbc_SqlRowsCopied); // Finally write to server sbc.WriteToServer(dtInsertRows); sbc.Close(); } }

    This one seemed to go really fast and did not even need an SP (can you use an SP with bulk copy? If you can, would it be better?). BatchCopy had no problem with a 500,000 batch size. So again, why make it smaller than the number of records you want to send? I found that with BatchCopy and a 500,000 batch size it took only 5 seconds to complete. I then tried with a batch size of 1,000 and it only took 8 seconds. So much faster than the bulkinsert one above.

    Now I tried the other tutorial. USE [Test] GO /****** Object: StoredProcedure [dbo].[spTEST_InsertXMLTEST_TEST] Script Date: 05/19/2010 15:39:03 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO ALTER PROCEDURE [dbo].[spTEST_InsertXMLTEST_TEST](@UpdatedProdData nText) AS DECLARE @hDoc int exec sp_xml_preparedocument @hDoc OUTPUT,@UpdatedProdData INSERT INTO TBL_TEST_TEST(NAME) SELECT XMLProdTable.NAME FROM OPENXML(@hDoc, 'ArrayOfTBL_TEST_TEST/TBL_TEST_TEST', 2) WITH ( ID Int, NAME varchar(100) ) XMLProdTable EXEC sp_xml_removedocument @hDoc

    C# code: /// <summary> /// This is using linq to sql to make the table objects. /// It is then serialized to an xml document and sent to a stored procedure /// that then does a bulk insert (I think with OpenXML). /// http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx /// </summary> private static void LinqInsertXMLBatch() { using (TestDataContext db = new TestDataContext()) { TBL_TEST_TEST[] testRecords = new TBL_TEST_TEST[500000]; for (int count = 0; count < 500000; count++) { TBL_TEST_TEST testRecord = new TBL_TEST_TEST(); testRecord.NAME = "Name : " + count; testRecords[count] = testRecord; } StringBuilder sBuilder = new StringBuilder(); System.IO.StringWriter sWriter = new System.IO.StringWriter(sBuilder); XmlSerializer serializer = new XmlSerializer(typeof(TBL_TEST_TEST[])); serializer.Serialize(sWriter, testRecords); db.insertTestData(sBuilder.ToString()); } }

    So I like this because I get to use objects, even though it is kinda redundant. I don't get how the SP works. Like, I don't get the whole thing. I don't know if OPENXML has some batch insert under the hood, but I do not even know how to take this example SP and change it to fit my tables since, like I said, I don't know what is going on. I also don't know what would happen if the object has more tables in it. Like, say I have a ProductName table that has a relationship to a Product table or something like that. In linq to sql you could get the product name object and make changes to the Product table in that same object. So I am not sure how to take that into account. I am not sure if I would have to do separate inserts or what. The time was pretty good: for 500,000 records it took 52 seconds.

    The last way, of course, was just using linq to do it all, and it was pretty bad. /// <summary> /// This is using linq to sql to insert lots of records. /// This way is slow as it uses no mass insert. /// Only tried to insert 50,000 records as I did not want to sit around till it did 500,000 records. /// http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx /// </summary> private static void LinqInsertAll() { using (TestDataContext db = new TestDataContext()) { db.CommandTimeout = 600; for (int count = 0; count < 50000; count++) { TBL_TEST_TEST testRecord = new TBL_TEST_TEST(); testRecord.NAME = "Name : " + count; db.TBL_TEST_TESTs.InsertOnSubmit(testRecord); } db.SubmitChanges(); } } I did only 50,000 records, and that took over a minute to do.

    So I have really narrowed it down to the linq to sql bulk insert way or bulk copy. I am just not sure how to do it when you have relationships, for either way. I am not sure how they both stand up when doing updates instead of inserts, as I have not gotten around to trying it yet. I don't think I will ever need to insert/update more than 50,000 records at one time, but at the same time I know I will have to do validation on records before inserting, so that will slow it down, and that sort of makes linq to sql nicer, as you've got objects, especially if you're first parsing data from an xml file before you insert into the database.

    Full C# code: using System; using System.Collections.Generic; using System.Linq; using System.Text; using System.Xml.Serialization; using System.Data; using System.Data.SqlClient; namespace TestIQueryable { class Program { private static string connectionString = ""; static void Main(string[] args) { BatchInsert(); Console.WriteLine("done"); } /// <summary> /// This is using linq to sql to insert lots of records. /// This way is slow as it uses no mass insert. /// Only tried to insert 50,000 records as I did not want to sit around till it did 500,000 records. /// http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx /// </summary> private static void LinqInsertAll() { using (TestDataContext db = new TestDataContext()) { db.CommandTimeout = 600; for (int count = 0; count < 50000; count++) { TBL_TEST_TEST testRecord = new TBL_TEST_TEST(); testRecord.NAME = "Name : " + count; db.TBL_TEST_TESTs.InsertOnSubmit(testRecord); } db.SubmitChanges(); } } /// <summary> /// This is using linq to sql to make the table objects. /// It is then serialized to an xml document and sent to a stored procedure /// that then does a bulk insert (I think with OpenXML). /// http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx /// </summary> private static void LinqInsertXMLBatch() { using (TestDataContext db = new TestDataContext()) { TBL_TEST_TEST[] testRecords = new TBL_TEST_TEST[500000]; for (int count = 0; count < 500000; count++) { TBL_TEST_TEST testRecord = new TBL_TEST_TEST(); testRecord.NAME = "Name : " + count; testRecords[count] = testRecord; } StringBuilder sBuilder = new StringBuilder(); System.IO.StringWriter sWriter = new System.IO.StringWriter(sBuilder); XmlSerializer serializer = new XmlSerializer(typeof(TBL_TEST_TEST[])); serializer.Serialize(sWriter, testRecords); db.insertTestData(sBuilder.ToString()); } } /// <summary> /// An ado.net 2.0 way to mass insert records. This seems to be the fastest. /// http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 /// </summary> private static void BatchBulkCopy() { // Get the DataTable DataTable dtInsertRows = GetDataTable(); using (SqlBulkCopy sbc = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.KeepIdentity)) { sbc.DestinationTableName = "TBL_TEST_TEST"; // Number of records to be processed in one go sbc.BatchSize = 500000; // Map the Source Column from DataTable to the Destination Columns in SQL Server 2005 Person Table // sbc.ColumnMappings.Add("ID", "ID"); sbc.ColumnMappings.Add("NAME", "NAME"); // Number of records after which client has to be notified about its status sbc.NotifyAfter = dtInsertRows.Rows.Count; // Event that gets fired when NotifyAfter number of records are processed. sbc.SqlRowsCopied += new SqlRowsCopiedEventHandler(sbc_SqlRowsCopied); // Finally write to server sbc.WriteToServer(dtInsertRows); sbc.Close(); } } /// <summary> /// Another ado.net 2.0 way that uses a stored procedure to do a bulk insert. /// Seems slower than the "BatchBulkCopy" way and it crashes when you try to insert 500,000 records in one go. /// http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 /// </summary> private static void BatchInsert() { // Get the DataTable with Rows State as RowState.Added DataTable dtInsertRows = GetDataTable(); SqlConnection connection = new SqlConnection(connectionString); SqlCommand command = new SqlCommand("sp_BatchInsert", connection); command.CommandType = CommandType.StoredProcedure; command.UpdatedRowSource = UpdateRowSource.None; // Set the Parameter with appropriate Source Column Name command.Parameters.Add("@Name", SqlDbType.VarChar, 50, dtInsertRows.Columns[0].ColumnName); SqlDataAdapter adpt = new SqlDataAdapter(); adpt.InsertCommand = command; // Specify the number of records to be Inserted/Updated in one go. Default is 1. adpt.UpdateBatchSize = 500000; connection.Open(); int recordsInserted = adpt.Update(dtInsertRows); connection.Close(); } private static DataTable GetDataTable() { // You First need a DataTable and have all the insert values in it DataTable dtInsertRows = new DataTable(); dtInsertRows.Columns.Add("NAME"); for (int i = 0; i < 500000; i++) { DataRow drInsertRow = dtInsertRows.NewRow(); string name = "Name : " + i; drInsertRow["NAME"] = name; dtInsertRows.Rows.Add(drInsertRow); } return dtInsertRows; } static void sbc_SqlRowsCopied(object sender, SqlRowsCopiedEventArgs e) { Console.WriteLine("Number of records affected : " + e.RowsCopied.ToString()); } } }
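    A minimal sketch of one way to bridge the two approaches compared above: build the records as linq to sql objects, copy them into a DataTable, and hand that to SqlBulkCopy. This reuses the TBL_TEST_TEST class and connectionString from the code above; it is an illustration, not code from either tutorial.

        // Sketch: feed linq-to-sql objects to SqlBulkCopy via a DataTable.
        // Assumes the TBL_TEST_TEST table and connectionString defined in the question's code.
        private static void BulkCopyFromObjects(IEnumerable<TBL_TEST_TEST> records)
        {
            DataTable table = new DataTable();
            table.Columns.Add("NAME", typeof(string));
            foreach (TBL_TEST_TEST record in records)
            {
                table.Rows.Add(record.NAME);   // ID is an identity column, so it is not supplied
            }
            using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connectionString))
            {
                bulkCopy.DestinationTableName = "TBL_TEST_TEST";
                bulkCopy.ColumnMappings.Add("NAME", "NAME");
                bulkCopy.BatchSize = 10000;    // rows sent per batch; 0 would send everything in one batch
                bulkCopy.WriteToServer(table);
            }
        }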

    Read the article

  • Need WIF Training?

    - by Your DisplayName here!
    I spend numerous hours every month answering questions about WIF and identity in general. This made me realize that this is still quite a complicated topic once you go beyond the standard fedutil stuff. My good friend Brock and I put together a two-day training course about WIF that covers everything we think is important. The course includes extensive lab material where you take a standard application and apply all kinds of claims and federation techniques and technologies like WS-Federation, WS-Trust, session management, delegation, home realm discovery, multiple identity providers, Access Control Service, REST, SWT and OAuth. The lab also includes the latest version of the thinktecture identityserver, and you will learn how to use and customize it. If you are looking for an open enrollment style of training, have a look here. Or contact me directly! The course outline looks as follows:

    Day 1

    Intro to Claims-based Identity & the Windows Identity Foundation - WIF introduces important concepts like conversion of security tokens and credentials to claims, claims transformation and claims-based authorization. In this module you will learn the basics of the WIF programming model and how WIF integrates into existing .NET code.

    Externalizing Authentication for Web Applications - WIF includes support for the WS-Federation protocol. This protocol allows separating business and authentication logic into separate (distributed) applications. The authentication part is called an identity provider or, in more general terms, a security token service. This module looks at this scenario both from an application and an identity provider point of view and walks you through the necessary concepts to centralize application login logic, both using a standard product like Active Directory Federation Services and using a custom token service built with WIF's API support.

    Externalizing Authentication for SOAP Services - One big benefit of WIF is that it unifies the security programming model for ASP.NET and WCF. In the spirit of the preceding modules, we will have a look at how WIF integrates into the (SOAP) web service world. You will learn how to separate authentication into a separate service using the WS-Trust protocol and how WIF can simplify the WCF security model and extensibility API.

    Day 2

    Advanced Topics: Security Token Service Architecture, Delegation and Federation - The preceding modules covered the 80/20 cases of WIF in combination with ASP.NET and WCF. In many scenarios this is just the tip of the iceberg. Especially when two business partners decide to federate, you usually have to deal with multiple token services and their implications in application design. Identity delegation is a feature that allows transporting the client identity over a chain of service invocations to make authorization decisions over multiple hops. In addition you will learn about the principal architecture of an STS, how to customize the one that comes with this training course, as well as how to build your own.

    Outsourcing Authentication: Windows Azure & the Azure AppFabric Access Control Service - Microsoft provides a multi-tenant security token service as part of the Azure platform cloud offering. This is an interesting product because it allows you to outsource vital infrastructure services to a managed environment that guarantees uptime and scalability. Another advantage of the Access Control Service is that it allows easy integration of both the "enterprise" protocols like WS-* and "web identities" like LiveID, Google or Facebook into your applications. ACS acts as a protocol bridge in this case, where the application developer doesn't need to implement all these protocols but simply uses a service to make it happen.

    Claims & Federation for the Web and Mobile World - The web & mobile world is also moving to a token- and claims-based model. While the mechanics are almost identical, other protocols and token types are used to achieve better HTTP (REST) and JavaScript integration for in-browser applications and small-footprint devices. Patterns such as allowing third-party applications to work with your data without having to disclose your credentials are also important concepts for these application types. The nice thing about WIF and its powerful base APIs and abstractions is that it can shield application logic from these details while you focus on implementing the actual application. HTH

    Read the article

  • How can I insert the contents of one file into another file right before a specific line?

    - by jelbatnigi
    Hi, how can I insert the contents of a file into another file right before a specific line using sed? For example, I have file1.xml that has the following: <field tagRef="376"> </field> <field tagRef="377"> </field> <field tagRef="58"> </field> <group ref="StandardMessageTrailer" required="true"/> </fieldList> and file2.xml has the following: <field tagRef="9647"> <description>Offset</description> </field> <field tagRef="9648"> <description>Offset Units/Direction</description> </field> <field tagRef="9646"> <description>Anchor Price</description> </field> How can I insert the contents of file2 into file1 just before the <group ref="StandardMessageTrailer"/> line, so it will look like this: <field tagRef="376"> </field> <field tagRef="377"> </field> <field tagRef="58"> </field> <field tagRef="9647"> <description>Offset</description> </field> <field tagRef="9648"> <description>Offset Units/Direction</description> </field> <field tagRef="9646"> <description>Anchor Price</description> </field> <group ref="StandardMessageTrailer" required="true"/> </fieldList> I know how to insert after that line using sed '/group ref="StandardMessageTrailer"/r file2.xml' file1.xml > newfile.xml but I want to insert it before. I'd appreciate the help.

    Read the article

  • Clear the data from the controls after inserting the row?

    - by kathy
    Hi, I'm using ASPxGridView and performing the add operation in the RowInserting event. I don't call gridView.CancelEdit(); because I need the add form to stay open so I can insert another row. My problem is: how can I clear the data from the controls in the add form so that I can enter the data for the next row? Thanks
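    A sketch of one possible approach, assuming the DevExpress ASPxGridView API: after saving the values yourself in RowInserting, cancel the built-in processing, close the edit form, and immediately reopen a fresh new-row form so the editors come back empty. SaveRow is a hypothetical helper for your own data access code; the event and grid names are placeholders.

        // Sketch (assumed ASPxGridView API): save the values yourself, then reopen a blank add form.
        protected void gridView_RowInserting(object sender, DevExpress.Web.Data.ASPxDataInsertingEventArgs e)
        {
            SaveRow(e.NewValues);     // hypothetical helper that writes e.NewValues to the database
            e.Cancel = true;          // stop the grid's own insert processing
            gridView.CancelEdit();    // close the current add form (discards the typed values)
            gridView.AddNewRow();     // immediately open a fresh, empty add form for the next row
        }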

    Read the article

  • How to bulk insert from CSV when some fields have newline characters?

    - by z-boss
    I have a CSV dump from another DB that looks like this (id, name, notes):
    1001,John Smith,15 Main Street
    1002,Jane Smith,"2010 Rockliffe Dr. Pleasantville, IL USA"
    1003,Bill Karr,2820 West Ave.
    The last field may contain carriage returns and commas, in which case it is surrounded by double quotes. I use this code to import the CSV into my table: BULK INSERT CSVTest FROM 'c:\csvfile.csv' WITH ( FIELDTERMINATOR = ',', ROWTERMINATOR = '\n' ) SQL Server 2005 bulk insert cannot figure out that carriage returns inside quotes are not row terminators. How can I work around this?
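    BULK INSERT itself has no notion of quoted fields, so one workaround (a sketch, not the only option) is to parse the file on the client with TextFieldParser, which understands quotes and embedded line breaks, and then push the rows to SQL Server with SqlBulkCopy. The table and file names follow the question; the connection string is a placeholder.

        // Sketch: parse the quoted CSV on the client, then bulk copy into CSVTest.
        // Needs references to System.Data, System.Data.SqlClient and Microsoft.VisualBasic (for TextFieldParser).
        static void ImportCsv(string path, string connectionString)
        {
            var table = new DataTable();
            table.Columns.Add("id", typeof(int));
            table.Columns.Add("name", typeof(string));
            table.Columns.Add("notes", typeof(string));

            using (var parser = new Microsoft.VisualBasic.FileIO.TextFieldParser(path))
            {
                parser.TextFieldType = Microsoft.VisualBasic.FileIO.FieldType.Delimited;
                parser.SetDelimiters(",");
                parser.HasFieldsEnclosedInQuotes = true;   // commas and line breaks inside quotes stay in the field
                while (!parser.EndOfData)
                {
                    string[] fields = parser.ReadFields();
                    table.Rows.Add(int.Parse(fields[0]), fields[1], fields[2]);
                }
            }

            using (var bulkCopy = new SqlBulkCopy(connectionString))
            {
                bulkCopy.DestinationTableName = "CSVTest";
                bulkCopy.WriteToServer(table);
            }
        }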

    Read the article

  • How to insert a foreign key value using LINQ to Entities?

    - by Gamble
    I have a table (TestTable), for example: ID, Name, Parentid (FK), and I would like to insert a new record like ID(1) Name(Test) ParentID(5) FK. How can I insert a new record into TestTable with linq to entity? var testTable = new TestTable(); testTable.ID = 1; testTable.Name = "TestName"; testTable ... Thank you for a working example.
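    A sketch of two common ways to supply the foreign key value, depending on how the Entity Framework model was generated; the context, entity set and navigation property names (TestEntities, TestTables, Parent) are hypothetical and need to match your own model.

        // Sketch (names are placeholders for whatever the designer generated).
        // Needs System.Data (for EntityKey) and the generated ObjectContext.
        using (var context = new TestEntities())
        {
            var row = new TestTable { Name = "Test" };   // ID is typically an identity, so it is not set

            // If the model exposes the FK column directly (EF4 "foreign key association"):
            // row.Parentid = 5;

            // If only a navigation property exists (EF1-style independent association):
            row.ParentReference.EntityKey =
                new EntityKey("TestEntities.TestTables", "ID", 5);

            context.AddToTestTables(row);
            context.SaveChanges();
        }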

    Read the article

  • Why can't I insert a record with a foreign key in a single server request?

    - by Eran Betzalel
    I'm trying to do a simple insert with a foreign key, but it seems that I need to call db.SaveChanges() for every record insert. How can I manage to use only one db.SaveChanges() at the end of this program? foreach (var file in files) { db.AddToFileSet(file); db.SaveChanges(); db.AddToDirectorySet( new GlxCustomerPhone { SimIdentifier = file.Name + "Dir", CreationDate = DateTime.UtcNow, file_relation = file }); db.SaveChanges(); }
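    For what it's worth, a sketch of the same loop with a single SaveChanges: because file_relation ties each new GlxCustomerPhone to its file object, the context can order both inserts and fix up the foreign key in one round of changes (assuming the relationship is mapped as the snippet above suggests).

        // Sketch: queue up all the related objects first, then save once.
        foreach (var file in files)
        {
            db.AddToFileSet(file);
            db.AddToDirectorySet(new GlxCustomerPhone
            {
                SimIdentifier = file.Name + "Dir",
                CreationDate = DateTime.UtcNow,
                file_relation = file          // the relationship lets EF resolve the FK at save time
            });
        }
        db.SaveChanges();                     // one call; EF orders the inserts for you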

    Read the article

  • Problem inserting Thai language data into a SQL Server 2008 text field: it shows ????

    - by embarus
    Hello everyone, I created an ASP.NET MVC web application and tried to insert Thai-language data into a SQL Server 2008 field with the text data type, but the database stores ??????, which is incorrect. For the HTML page I use charset utf-8. I also tried encoding the string before inserting it into the database and changing the database field collation, but these did not solve the problem. I'm looking forward to your reply. Thanks, embarus
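    A sketch of the usual fix, assuming you can change the column type: Thai characters need a Unicode column (nvarchar or ntext rather than text or varchar) and a Unicode parameter on the insert, so no encoding tricks or collation changes should be required. Table and column names below are illustrative.

        // Sketch: insert Thai text into an nvarchar(max) column using a Unicode parameter.
        // Hypothetical table: CREATE TABLE Notes (Id int IDENTITY PRIMARY KEY, Body nvarchar(max))
        string connectionString = "connection string here";
        string thaiText = "สวัสดี";
        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand(
            "INSERT INTO Notes (Body) VALUES (@body)", connection))
        {
            command.Parameters.Add("@body", SqlDbType.NVarChar, -1).Value = thaiText;   // -1 = nvarchar(max)
            connection.Open();
            command.ExecuteNonQuery();
        }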

    Read the article

  • How do I use Django to insert a Geometry Field into the database?

    - by alex
    class LocationLog(models.Model): user = models.ForeignKey(User) utm = models.GeometryField(spatial_index=True) This is my database model. I would like to insert a row: a circle at point (-55, 333) with a radius of 10. How can I put this circle into the geometry field? Of course, I would then want to check which circles overlap a given circle (that would be my select statement).

    Read the article

  • How to insert an integer value from a grid into a SQL table?

    - by z-chaos
    Hi, I have an AdvWebGrid where the 7th column is a DynEdit control where the user enters a value. Now I have to take the entered value and insert it into the SQL table. For example, I have 7 records in the grid; the user enters comments for the first three records and saves. Now I have to insert/update the first three comments in the table.
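    A sketch of the database side only, since reading the typed values back out depends on the AdvWebGrid API: once you have (record id, comment) pairs from the edited rows, a parameterized UPDATE per changed row is enough. The Records table and its column names here are assumptions.

        // Sketch: write the comments captured from the grid back to an assumed Records table.
        // Needs System.Data, System.Data.SqlClient and System.Collections.Generic.
        private static void SaveComments(string connectionString, IEnumerable<KeyValuePair<int, string>> comments)
        {
            using (SqlConnection connection = new SqlConnection(connectionString))
            using (SqlCommand command = new SqlCommand(
                "UPDATE Records SET Comment = @comment WHERE Id = @id", connection))
            {
                command.Parameters.Add("@comment", SqlDbType.NVarChar, 500);
                command.Parameters.Add("@id", SqlDbType.Int);
                connection.Open();
                foreach (KeyValuePair<int, string> pair in comments)   // only the rows the user actually edited
                {
                    command.Parameters["@id"].Value = pair.Key;
                    command.Parameters["@comment"].Value = pair.Value;
                    command.ExecuteNonQuery();
                }
            }
        }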

    Read the article

  • SQL Insert into ... values ( SELECT ... FROM ... )

    - by Shadow_x99
    I am trying to insert into a table using the input from another table. Although this is entirely feasible for many database engines, I always seem to struggle to remember the correct syntax for the SQL engine of the day (MySQL, Oracle, SQL Server, Informix, DB2). I've been wondering if there is a silver-bullet syntax coming from the SQL standard (for example, SQL-92) that would allow me to insert the values without worrying about the underlying database.
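    For reference, the INSERT INTO target (columns) SELECT ... FROM source form (no VALUES keyword) is the SQL-92 shape and works on all of the engines listed. Below is a sketch of issuing it through ADO.NET; the table and column names are invented for the example, and the @-style parameter is SqlClient-specific even though the statement itself is portable.

        // Sketch: the portable INSERT ... SELECT form, issued through ADO.NET (names are illustrative).
        const string sql =
            "INSERT INTO target_table (col1, col2) " +
            "SELECT col1, col2 FROM source_table WHERE col3 = @value";

        string connectionString = "connection string here";
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            command.Parameters.AddWithValue("@value", 42);
            connection.Open();
            int rowsInserted = command.ExecuteNonQuery();   // number of rows copied into target_table
        }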

    Read the article

  • Uploading a CSV file to SQL Server - identity column problem

    - by Doozer1979
    Given a column structure in a CSV file of: First_Name, Last_Name, Date_Of_Birth And a SQL Server table with a structure of ID(PK) | First_Name | Last_Name | Date_Of_Birth (Field ID is an identity with an auto-increment of 1) How do I arrange it so that SQL Server does not attempt to insert the First_Name column from the CSV file into the ID field? For info, the CSV is loaded into a DataTable and then copied to SQL Server using SqlBulkCopy. Should I be modifying the CSV file before the import to add the ID column? (The destination table is truncated prior to import, so there's no need to worry about duplicate key values.) Or perhaps adding an ID column to the DataTable? Or is there a setting in SQL Server that I may have missed?
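    One approach, sketched below on the assumption that the DataTable columns are named after the CSV headers: give SqlBulkCopy explicit column mappings so the three source columns land in the three named destination columns, and simply never mention the ID identity column, leaving SQL Server to generate it.

        // Sketch: map CSV columns by name so the identity ID column is left for SQL Server to fill.
        static void CopyCsvTable(DataTable csvDataTable, string connectionString)
        {
            using (var bulkCopy = new SqlBulkCopy(connectionString))
            {
                bulkCopy.DestinationTableName = "dbo.People";   // destination table name is an assumption
                bulkCopy.ColumnMappings.Add("First_Name", "First_Name");
                bulkCopy.ColumnMappings.Add("Last_Name", "Last_Name");
                bulkCopy.ColumnMappings.Add("Date_Of_Birth", "Date_Of_Birth");
                bulkCopy.WriteToServer(csvDataTable);           // the DataTable loaded from the CSV
            }
        }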

    Read the article

  • Code signing identity does not match any certificate in my keychain (Mac App Store development)

    - by larntin
    Hi,
    1. I have already downloaded the "Apple Worldwide Developer Relations Certification Authority" certificate and added it to my keychain.
    2. My team leader had already created two certificates for Mac App Store development; I downloaded them and added them to my keychain.
    3. I used two methods to sign my app, but both failed. First, I added a code signing section to my .xcodeproj (Xcode 3.2.5). Second, I used this script: productbuild --component ./bin/MAS_Release/MyApp.app /Applications --sign "3rd Party Mac Developer Application: My Company Co., Ltd." --product ./src/MyApp/MyApp-Info.plist MyApp.pkg But it failed with this message: Code Signing Identity '3rd Party Mac Developer Application: My Company Co., Ltd.' does not match any valid, non-expired, code-signing certificate in your keychain. I noticed that my certificates in the keychain don't have the small disclosure triangle (which would indicate an attached private key). How can I make the triangle appear? (When I import the certificates from my team agent, they don't get the triangle.)

    Read the article

  • How to insert a SQLite record with a datetime set to 'now' in Android application?

    - by droidguy
    Say we have a table created as: create table notes (_id integer primary key autoincrement, created_date date) To insert a record, I'd use ContentValues initialValues = new ContentValues(); initialValues.put("date_created", ""); long rowId = mDb.insert(DATABASE_TABLE, null, initialValues); But how do I set the date_created column to "now"? To make it clear, initialValues.put("date_created", "datetime('now')"); is not the right solution; it just sets the column to the literal text "datetime('now')".

    Read the article

  • Where can I find sample XHTML5 source code?

    - by Bytecode Ninja
    Where can I find sample *X*HTML 5 pages? I mainly want to know if it is possible to mix and match XHTML 5 with other XML languages just like XHTML 1 or not. For example is something like this valid in XHTML 5? <!DOCTYPE html PUBLIC "WHAT SHOULD BE HERE?" "WHAT SHOULD BE HERE?"> <html xmlns="WHAT SHOULD BE HERE?" xmlns:ui="http://java.sun.com/jsf/facelets"> <head> <title><ui:insert name="title">Default title</ui:insert></title> <link rel="stylesheet" type="text/css" href="./css/main.css"/> </head> <body> <div id="header"> <ui:insert name="header"> <ui:include src="header.xhtml"/> </ui:insert> </div> <div id="left"> <ui:insert name="navigation" > <ui:include src="navigation.xhtml"/> </ui:insert> </div> <div id="center"> <br /> <span class="titleText"> <ui:insert name="title" /> </span> <hr /> <ui:insert name="content"> <div> <ui:include src="content.xhtml"/> </div> </ui:insert> </div> <div id="right"> <ui:insert name="news"> <ui:include src="news.xhtml"/> </ui:insert> </div> <div id="footer"> <ui:insert name="footer"> <ui:include src="footer.xhtml"/> </ui:insert> </div> </body> </html> Thanks in advance.

    Read the article

  • Grails unit testing domain classes with Set properties - is this safe?

    - by Ali G
    I've created a domain class in Grails like this: class MyObject { static hasMany = [tags: String] // Have to declare this here, as nullable constraint does not seem to be honoured Set tags = new HashSet() static constraints = { tags(nullable: false) } } Writing unit tests to check the size and content of the MyObject.tags property, I found I had to do the following: assertLength(x, myObject.tags as Object[]) assertEquals(new HashSet([...]), myObject.tags) To make the syntax nicer for writing the tests, I implemented the following methods: void assertEquals(List expected, Set actual) { assertEquals(new HashSet(expected), actual) } void assertLength(int expected, Set set) { assertLength(expected, set as Object[]) } I can now call the assertLength() and assertEquals() methods directly on an instance of Set, e.g. assertLength(x, myObject.tags) assertEquals([...], myObject.tags) I'm new to Groovy and Grails, so unaware how dangerous method overloading like this is. Is it safe? If so, I'm slightly* surprised that these methods (or similar) aren't already available - please let me know if they are. * I can see how these methods could also introduce ambiguity if people weren't expecting them. E.g. assertLength(1, set) always passes, no matter what the content of set

    Read the article

  • Vim: Smart indent when entering insert mode on blank line?

    - by TheDeeno
    When I open a new line (via 'o') my cursor jumps to a correctly indented position on the next line. On the other hand, entering insert mode while my cursor is on a blank line doesn't move my cursor to the correctly indented location. How do I make vim correctly indent my cursor when entering insert mode (via i) on a blank line?

    Read the article

  • Using IPrincipal.Identity.Name as a key in a database to identify users' rows

    - by bplus
    I'm writing a small intranet app that uses Windows Authentication and ASP.NET MVC. I need to store various bits of data in a db against each user. As far as I can tell, the IPrincipal object does not seem to have something like a unique id. So I was thinking I could just use User.Identity.Name as a unique value to identify rows in my db. Is this a bad idea? Is there an alternative to this approach? Thanks for any help.
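    If the name route is taken, a sketch of what the data access might look like; the UserSettings table, its columns, and the unique index on UserName are assumptions, not an established schema.

        // Sketch: key per-user rows on User.Identity.Name (e.g. "DOMAIN\\jsmith").
        // Assumes a UserSettings table with a UserName column carrying a unique index.
        // Needs System.Security.Principal, System.Data.SqlClient and System.Collections.Generic.
        public IEnumerable<string> LoadSettingsFor(IPrincipal user, string connectionString)
        {
            var results = new List<string>();
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(
                "SELECT SettingValue FROM UserSettings WHERE UserName = @name", connection))
            {
                command.Parameters.AddWithValue("@name", user.Identity.Name);
                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        results.Add(reader.GetString(0));
                    }
                }
            }
            return results;
        }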

    Read the article

  • Is it possible to use SQL XML to insert, and get output from each record?

    - by nbolton
    I would like to perform a SQL XML insert (on MSSQL), and in this case I need to insert a list of files into the DB (this is simple enough). However, there's an auto-generated PK column (ID), and I need the ID for each newly created filename without performing a second query. Is this possible? I guess it doesn't matter if the result is/isn't XML, but the input certainly has to be.
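    A sketch of one way to do this in a single round trip: shred the XML parameter with nodes() and add an OUTPUT clause to the INSERT, so the generated ID comes back next to each filename. The Files table, its columns, and the XML shape are made up for the example (requires SQL Server 2005 or later).

        // Sketch: insert rows from an XML parameter and read back (ID, FileName) pairs via OUTPUT.
        const string sql = @"
            INSERT INTO Files (FileName)
            OUTPUT INSERTED.ID, INSERTED.FileName
            SELECT f.value('.', 'nvarchar(260)')
            FROM @xml.nodes('/files/file') AS t(f);";

        string connectionString = "connection string here";
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            command.Parameters.Add("@xml", SqlDbType.Xml).Value =
                "<files><file>a.txt</file><file>b.txt</file></files>";
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Each row is the identity value paired with the filename just inserted.
                    Console.WriteLine("{0} -> {1}", reader.GetInt32(0), reader.GetString(1));
                }
            }
        }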

    Read the article

  • Where do you manually insert assertions into an automated coded ui test's code in VS2010?

    - by user1649536
    I am currently automating smoke tests, and I am trying to learn how to manually insert assertions written in C# into the UIMap.Designer.cs file. I have no direction on where to put the assertions, and all the literature I am finding only covers how to add assertions with the Coded UI Test Builder tool that is included with VS2010. Can anyone direct me to where I need to insert the assertions?
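    A sketch of the usual pattern: hand-written assertions go in the test method (or in the UIMap.cs partial class), not in UIMap.Designer.cs, because the designer file is regenerated by the Coded UI Test Builder and manual edits there are lost. The control, window and recorded-method names below are placeholders for whatever your own UI map exposes.

        // Sketch: a hand-coded assertion alongside recorded actions, inside a [CodedUITest] class.
        // Names are placeholders; use the property that matches your control type
        // (DisplayText for WinText/WpfText, InnerText for HtmlControl, etc.).
        using Microsoft.VisualStudio.TestTools.UITesting.WinControls;
        using Microsoft.VisualStudio.TestTools.UnitTesting;

        [TestMethod]
        public void SmokeTest_SaveShowsConfirmation()
        {
            this.UIMap.OpenOrderFormAndSave();   // recorded action method generated by the test builder

            // Manual assertion: check a control property after the recorded steps ran.
            WinText statusLabel = this.UIMap.UIOrderWindow.UIStatusText;   // placeholder control from the UI map
            Assert.AreEqual("Order saved", statusLabel.DisplayText,
                "Expected the confirmation text after saving.");
        }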

    Read the article
