Search Results

Search found 64 results on 3 pages for 'lieven'.


  • Create PDF from InDesign at runtime...

    - by Lieven Cardoen
    For a project I need to automate the creation of business cards. Right now they have an InDesign file for each business card template: they insert the info of all the people in the InDesign file and then generate a PDF from it. Entering the customer information in a web application is no problem, but how would I generate a PDF, and how would I alter the InDesign file at runtime? I suspect that altering the InDesign file programmatically isn't possible. Could I instead generate a PDF from InDesign with a single card template in it, copy that card in the PDF x number of times at runtime, and then inject each person's information (name, address, ...)? What's possible here? The final PDF is used by a machine that automatically creates the business cards, cuts them, and so on.
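
    One possible direction (a rough sketch, assuming the iTextSharp library, a one-card template exported from InDesign as template.pdf, and made-up coordinates, names and file paths) would be to keep the InDesign file as the design master, export it to a template PDF once, and stamp the person data onto it at runtime:

        // Sketch: stamp person data onto a one-card template PDF with iTextSharp.
        // template.pdf, card_john.pdf and the x/y coordinates are assumptions.
        using System.IO;
        using iTextSharp.text.pdf;

        class CardStamper
        {
            static void Main()
            {
                var reader = new PdfReader("template.pdf");            // template exported from InDesign
                using (var output = new FileStream("card_john.pdf", FileMode.Create))
                {
                    var stamper = new PdfStamper(reader, output);
                    PdfContentByte canvas = stamper.GetOverContent(1);  // draw on top of page 1
                    BaseFont font = BaseFont.CreateFont(BaseFont.HELVETICA, BaseFont.CP1252, BaseFont.NOT_EMBEDDED);

                    canvas.BeginText();
                    canvas.SetFontAndSize(font, 9);
                    // Coordinates depend entirely on the template layout; these are placeholders.
                    canvas.ShowTextAligned(PdfContentByte.ALIGN_LEFT, "John Doe", 40, 120, 0);
                    canvas.ShowTextAligned(PdfContentByte.ALIGN_LEFT, "Some Street 1, 9000 Gent", 40, 105, 0);
                    canvas.EndText();

                    stamper.Close();
                }
                reader.Close();
            }
        }

    Duplicating the stamped card x number of times into one sheet could then be done by merging pages (iTextSharp's PdfCopy, for instance), but whether the cutting machine accepts such a PDF is something the vendor would have to confirm.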

    Read the article

  • Output Caching with IIS7 - How To for a dynamic aspx page?

    - by Lieven Cardoen
    I have a RetrieveBlob.aspx page that gets some query string variables and returns an asset. Each url corresponds to a unique asset. In RetrieveBlob.aspx a Cache Profile is set. In Web.Config the profile looks like this (under the system.web tag):

        <caching>
          <outputCache enableOutputCache="true" />
          <outputCacheSettings>
            <outputCacheProfiles>
              <add duration="14800" enabled="true" varyByParam="*" name="AssetCacheProfile" />
            </outputCacheProfiles>
          </outputCacheSettings>
        </caching>

    Ok, this works fine. When I put a breakpoint in the code-behind of RetrieveBlob.aspx, it gets hit the first time and not on any later request. Now, I throw away the Cache Profile and instead put this in Web.Config under system.webServer:

        <caching>
          <profiles>
            <add extension=".swf" policy="CacheForTimePeriod" kernelCachePolicy="CacheForTimePeriod" duration="00:08:00" />
            <add extension=".flv" policy="CacheForTimePeriod" kernelCachePolicy="CacheForTimePeriod" duration="00:08:00" />
            <add extension=".gif" policy="CacheForTimePeriod" kernelCachePolicy="CacheForTimePeriod" duration="00:08:00" />
            <add extension=".png" policy="CacheForTimePeriod" kernelCachePolicy="CacheForTimePeriod" duration="00:08:00" />
            <add extension=".mp3" policy="CacheForTimePeriod" kernelCachePolicy="CacheForTimePeriod" duration="00:08:00" />
            <add extension=".jpeg" policy="CacheForTimePeriod" kernelCachePolicy="CacheForTimePeriod" duration="00:08:00" />
            <add extension=".jpg" policy="CacheForTimePeriod" kernelCachePolicy="CacheForTimePeriod" duration="00:08:00" />
          </profiles>
        </caching>

    Now the caching doesn't work anymore. What am I doing wrong? Is it possible to configure a caching profile for a dynamic aspx page under the caching tag of system.webServer? I already tried adding something like this:

        <add extension="RetrieveBlob.aspx" policy="CacheForTimePeriod" kernelCachePolicy="CacheForTimePeriod" duration="00:00:30" varyByQueryString="assetId, assetFileId" />

    But it doesn't work. An example of a url is: http://{server}/{application}/trunk/RetrieveBlob.aspx?assetId=31809&assetFileId=11829
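
    As a point of comparison (just a sketch, not something from the original question): instead of a cache profile or an IIS caching rule, the output-cache policy for a dynamic page can also be set from the code-behind, which keeps the ASP.NET output cache and its vary-by-parameter behaviour:

        // Sketch: set the output-cache policy for RetrieveBlob.aspx in code-behind,
        // varying on the two query string parameters. The duration is an example only.
        using System;
        using System.Web;
        using System.Web.UI;

        public partial class RetrieveBlob : Page
        {
            protected void Page_Load(object sender, EventArgs e)
            {
                Response.Cache.SetCacheability(HttpCacheability.Server);      // cache on the server
                Response.Cache.SetExpires(DateTime.UtcNow.AddSeconds(14800));
                Response.Cache.SetValidUntilExpires(true);
                Response.Cache.VaryByParams["assetId"] = true;
                Response.Cache.VaryByParams["assetFileId"] = true;

                // ... load the asset and write it to the response as before ...
            }
        }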

    Read the article

  • updateBinaryStream on ResultSet with FileStream

    - by Lieven Cardoen
    If I try to update a FileStream column I get the following error: com.microsoft.sqlserver.jdbc.SQLServerException: The result set is not updatable. Code:

        System.out.print("Now, let's update the filestream data.");
        FileInputStream iStream = new FileInputStream("c:\\testFile.mp3");
        rs.updateBinaryStream(2, iStream, -1);
        rs.updateRow();
        iStream.close();

    Why is this? Table in Sql Server 2008:

        CREATE TABLE [BinaryAssets].[BinaryAssetFiles](
          [BinaryAssetFileId] [int] IDENTITY(1,1) NOT NULL,
          [FileStreamId] [uniqueidentifier] ROWGUIDCOL NOT NULL,
          [Blob] [varbinary](max) FILESTREAM NULL,
          [HashCode] [nvarchar](100) NOT NULL,
          [Size] [int] NOT NULL,
          [BinaryAssetExtensionId] [int] NOT NULL,

    Query used in Java:

        String strCmd = "select BinaryAssetFileId, Blob from BinaryAssets.BinaryAssetFiles where BinaryAssetFileId = 1";
        stmt = con.createStatement();
        rs = stmt.executeQuery(strCmd);

    Read the article

  • SQL Server 2008 Management Studio doesn't recognize new Schema

    - by Lieven Cardoen
    I have created a new schema called Contexts in a database. Now when I want to write a query, Management Studio doesn't recognize the tables that belong to the new schema. It says: 'Invalid object name Contexts.ContextLibraries'... Transact-SQL:

        INSERT INTO [Contexts].[ContextLibraries] (ChannelId, [IsSystem])
        VALUES (@ChannelId, 1)

    When I try the same thing on my local database, it does work... Any ideas? I did try to change the default schema for the user from dbo to Contexts, but this doesn't work. I also checked Contexts in the schemas owned by this user, without success. Update: apparently the SQL query does work, but the editor gives an error saying the object is invalid.

    Read the article

  • Database Version Control in SQL Server 2008: Drop SPs and Functions

    - by Lieven Cardoen
    I'm working on versioning our database and am now searching for a way to drop all stored procedures and functions from a C# console application. I'd rather not create a stored procedure that drops all stored procedures and functions; it has to be some SQL executed from C#. I tried dropping each stored procedure before creating it, but I get this message: System.Data.SqlClient.SqlException: 'CREATE/ALTER PROCEDURE' must be the first statement in a query batch. The script for one SP, for example:

        DROP PROCEDURE [dbo].[sp_Economatic_LoadJournalEntryFeedbackByData]
        SET ANSI_NULLS ON
        SET QUOTED_IDENTIFIER ON
        CREATE PROCEDURE [dbo].[sp_Economatic_LoadJournalEntryFeedbackByData]
            @Data VARCHAR(MAX)
        AS
        BEGIN
            ...
        END

    So I guess that before creating all SPs and functions, I'll need to drop all SPs and functions first with one SQL script.
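
    A rough sketch of what that drop step could look like from the console application (the connection string is an assumption): list the user procedures and functions from the catalog views, drop each one in its own batch, and only afterwards run each CREATE script as the sole statement of its own batch, which is what the error message is complaining about:

        // Sketch: drop every user stored procedure and function before re-creating them.
        // The connection string is an assumption.
        using System;
        using System.Collections.Generic;
        using System.Data.SqlClient;

        class DropRoutines
        {
            static void Main()
            {
                const string connectionString = "Server=.;Database=MyDb;Integrated Security=true";
                var dropStatements = new List<string>();

                using (var con = new SqlConnection(connectionString))
                {
                    con.Open();

                    // P = procedure; FN/IF/TF = scalar, inline and table-valued functions.
                    const string listSql =
                        @"SELECT s.name, o.name, o.type
                          FROM sys.objects o
                          JOIN sys.schemas s ON s.schema_id = o.schema_id
                          WHERE o.type IN ('P', 'FN', 'IF', 'TF') AND o.is_ms_shipped = 0";

                    using (var cmd = new SqlCommand(listSql, con))
                    using (var reader = cmd.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            string schema = reader.GetString(0);
                            string name = reader.GetString(1);
                            string kind = reader.GetString(2).Trim() == "P" ? "PROCEDURE" : "FUNCTION";
                            dropStatements.Add(string.Format("DROP {0} [{1}].[{2}]", kind, schema, name));
                        }
                    }

                    // Each DROP (and later each CREATE) runs as its own batch.
                    foreach (string drop in dropStatements)
                    {
                        using (var cmd = new SqlCommand(drop, con))
                            cmd.ExecuteNonQuery();
                    }
                }
            }
        }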

    Read the article

  • Executing a bat file and returning the prompt

    - by Lieven Cardoen
    I have a problem with CruiseControl where an ant script executes a bat file that doesn't give me the prompt back. As a result, the project in CruiseControl keeps on building forever until I restart CruiseControl. How can I resolve this? It's a startup.bat from Wowza (streaming server) that I'm executing:

        @echo off
        call setenv.bat
        if not %WMSENVOK% == "true" goto end
        set _WINDOWNAME="Wowza Media Server 2"
        set _EXESERVER=
        if "%1"=="newwindow" (
          set _EXESERVER=start %_WINDOWNAME%
          shift
        )
        set CLASSPATH="%WMSAPP_HOME%\bin\wms-bootstrap.jar"
        rem cacls jmxremote.password /P username:R
        rem cacls jmxremote.access /P username:R
        rem NOTE: Here you can configure the JVM's built in JMX interface.
        rem See the "Server Management Console and Monitoring" chapter
        rem of the "User's Guide" for more information on how to configure the
        rem remote JMX interface in the [install-dir]/conf/Server.xml file.
        set JMXOPTIONS=-Dcom.sun.management.jmxremote=true
        rem set JMXOPTIONS=%JMXOPTIONS% -Djava.rmi.server.hostname=192.168.1.7
        rem set JMXOPTIONS=%JMXOPTIONS% -Dcom.sun.management.jmxremote.port=1099
        rem set JMXOPTIONS=%JMXOPTIONS% -Dcom.sun.management.jmxremote.authenticate=false
        rem set JMXOPTIONS=%JMXOPTIONS% -Dcom.sun.management.jmxremote.ssl=false
        rem set JMXOPTIONS=%JMXOPTIONS% -Dcom.sun.management.jmxremote.password.file="%WMSCONFIG_HOME%/conf/jmxremote.password"
        rem set JMXOPTIONS=%JMXOPTIONS% -Dcom.sun.management.jmxremote.access.file="%WMSCONFIG_HOME%/conf/jmxremote.access"
        rem log interceptor com.wowza.wms.logging.LogNotify - see Javadocs for ILogNotify
        %_EXESERVER% "%_EXECJAVA%" %JAVA_OPTS% %JMXOPTIONS% -Dcom.wowza.wms.AppHome="%WMSAPP_HOME%" -Dcom.wowza.wms.ConfigURL="%WMSCONFIG_URL%" -Dcom.wowza.wms.ConfigHome="%WMSCONFIG_HOME%" -cp %CLASSPATH% com.wowza.wms.bootstrap.Bootstrap start
        :end

    Read the article

  • Performance problem: data warehouse with lots of indexes

    - by Lieven Cardoen
    Our product runs tests for some 350 candidates at the same time. At the end of a test, the results for each candidate are moved to a data warehouse that has lots of indexes on it. For each test there are some 400 records to be entered in the data warehouse, so 400 x 350 is a lot of records. If there are not many records in the data warehouse yet, all goes well. But if there are already lots of records in the data warehouse, then a lot of inserts fail... Is there a way to have indexes that are only rebuilt at the end of the day, or isn't that the real problem? How would you solve this?
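
    One thing that could be tried (a sketch only; the table, index and connection names are made up) is to disable the non-clustered indexes on the warehouse table before the big load and rebuild them once afterwards, so the individual inserts don't have to maintain them:

        // Sketch: disable a non-clustered index before a large load and rebuild it
        // afterwards. dbo.Results, the index name and the connection string are assumptions.
        using System.Data.SqlClient;

        class WarehouseLoad
        {
            static void Run(string connectionString)
            {
                using (var con = new SqlConnection(connectionString))
                {
                    con.Open();

                    // Disable non-clustered indexes only; disabling the clustered index
                    // would make the table unreadable until it is rebuilt.
                    Execute(con, "ALTER INDEX IX_Results_CandidateId ON dbo.Results DISABLE");

                    // ... perform the 400 x 350 inserts here ...

                    Execute(con, "ALTER INDEX IX_Results_CandidateId ON dbo.Results REBUILD");
                }
            }

            static void Execute(SqlConnection con, string sql)
            {
                using (var cmd = new SqlCommand(sql, con))
                    cmd.ExecuteNonQuery();
            }
        }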

    Read the article

  • Is Streaming Video possible with Sql Filestream?

    - by Lieven Cardoen
    We have stored all media in Sql Filestream, but now we'll need video and audio streaming... Will this be possible with Sql Filestream, or will I have to take all of the video and audio out of the database? Which technology would you use to enable video/audio streaming?

      • WebORB
      • FluorineFX
      • Wowza (way better, I think, than the first two)
      • IIS Media (haven't looked into this yet)
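
    For what it's worth, FILESTREAM data can at least be read back as an ordinary .NET stream, which is what a custom streaming handler would need; a minimal sketch (the table and column names come from the earlier BinaryAssets question, so treat them as assumptions):

        // Sketch: open a FILESTREAM value as a stream via SqlFileStream.
        // Requires an active transaction; table/column names are assumptions.
        using System.Data.SqlClient;
        using System.Data.SqlTypes;
        using System.IO;

        class FileStreamReader
        {
            static byte[] ReadBlob(string connectionString, int binaryAssetFileId)
            {
                using (var con = new SqlConnection(connectionString))
                {
                    con.Open();
                    using (SqlTransaction tx = con.BeginTransaction())
                    {
                        const string sql =
                            @"SELECT Blob.PathName(), GET_FILESTREAM_TRANSACTION_CONTEXT()
                              FROM BinaryAssets.BinaryAssetFiles
                              WHERE BinaryAssetFileId = @id";

                        string path;
                        byte[] txContext;
                        using (var cmd = new SqlCommand(sql, con, tx))
                        {
                            cmd.Parameters.AddWithValue("@id", binaryAssetFileId);
                            using (SqlDataReader reader = cmd.ExecuteReader())
                            {
                                reader.Read();
                                path = reader.GetString(0);
                                txContext = (byte[])reader[1];
                            }
                        }

                        byte[] data;
                        using (var blob = new SqlFileStream(path, txContext, FileAccess.Read))
                        using (var ms = new MemoryStream())
                        {
                            var buffer = new byte[4096];
                            int read;
                            while ((read = blob.Read(buffer, 0, buffer.Length)) > 0)
                                ms.Write(buffer, 0, read);   // or write chunks straight to the response
                            data = ms.ToArray();
                        }

                        tx.Commit();
                        return data;
                    }
                }
            }
        }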

    Read the article

  • Sql Profiler: Scan:Started when executing a stored procedure

    - by Lieven Cardoen
    Does SQL Server have to start a scan to execute a stored procedure? In Sql Profiler I can see this:

        RPC Starting ( exec sp_Edu3_SelectExamSession @ExamSessionId=N'AccessCode39361814' )
        Scan:Started
        Scan:Started
        Scan:Started
        RPC Completed ( exec sp_Edu3_SelectExamSession @ExamSessionId=N'AccessCode39361814' )

    Can I somehow see what's happening inside the stored procedure? Different queries are run in that SP, but they do not seem to appear in Sql Profiler (maybe I need to check some more events?). The Scan:Started events are probably scans caused by the queries in the SP? Or not?

    Read the article

  • Caching Profiles web.config vs IIS

    - by Lieven Cardoen
    What is the difference between configuring a caching profile in Web.Config and configuring it in IIS? If you have this in Web.Config:

        <caching>
          <outputCache enableOutputCache="true" />
          <outputCacheSettings>
            <outputCacheProfiles>
              <add duration="14800" enabled="true" varyByParam="*" name="AssetCacheProfile" />
            </outputCacheProfiles>
          </outputCacheSettings>
        </caching>

    and nothing configured in IIS under Output Caching, will it work? And if you add all the extensions I use to Output Caching in IIS, what does that change? It's an aspx page, RetrieveBlob.aspx, that uses this caching profile:

        <%@ OutputCache CacheProfile="AssetCacheProfile" %>
        <%@ Page Language="C#" AutoEventWireup="true" CodeFile="RetrieveBlob.aspx.cs" Inherits="RetrieveBlob" %>

    Read the article

  • Sql Server query using a function and a view is slower

    - by Lieven Cardoen
    I have a table with an xml column named Data:

        CREATE TABLE [dbo].[Users](
          [UserId] [int] IDENTITY(1,1) NOT NULL,
          [FirstName] [nvarchar](max) NOT NULL,
          [LastName] [nvarchar](max) NOT NULL,
          [Email] [nvarchar](250) NOT NULL,
          [Password] [nvarchar](max) NULL,
          [UserName] [nvarchar](250) NOT NULL,
          [LanguageId] [int] NOT NULL,
          [Data] [xml] NULL,
          [IsDeleted] [bit] NOT NULL,...

    In the Data column there's this xml:

        <data>
          <RRN>...</RRN>
          <DateOfBirth>...</DateOfBirth>
          <Gender>...</Gender>
        </data>

    Now, executing this query:

        SELECT UserId FROM Users WHERE data.value('(/data/RRN)[1]', 'nvarchar(max)') = @RRN

    after clearing the cache takes (if I execute it a couple of times after each other) 910, 739, 630, 635, ... ms. A db specialist told me that adding a function and a view and changing the query would make it much faster to search for a user with a given RRN. But instead, these are the results when I execute it with the changes from the db specialist: 2584, 2342, 2322, 2383, ... This is the added function:

        CREATE FUNCTION dbo.fn_Users_RRN(@data xml)
        RETURNS varchar(100)
        WITH SCHEMABINDING
        AS
        BEGIN
          RETURN @data.value('(/data/RRN)[1]', 'varchar(max)');
        END;

    The added view:

        CREATE VIEW vwi_Users WITH SCHEMABINDING
        AS
        SELECT UserId, dbo.fn_Users_RRN(Data) AS RRN from dbo.Users

    Indexes:

        CREATE UNIQUE CLUSTERED INDEX cx_vwi_Users ON vwi_Users(UserId)
        CREATE NONCLUSTERED INDEX cx_vwi_Users__RRN ON vwi_Users(RRN)

    And then the changed query:

        SELECT UserId FROM Users WHERE dbo.fn_Users_RRN(Data) = '59021626919-61861855-S_FA1E11'

    Why is the solution with a function and a view slower?
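
    One detail worth checking (an assumption about the SQL Server edition in use): outside Enterprise edition the optimizer will not match a base-table query to an indexed view on its own, so the rewritten query may still be shredding the XML for every row. Querying the view directly with the NOEXPAND hint forces the view's index, roughly like this:

        // Sketch: query the indexed view directly and force its index with NOEXPAND
        // (needed outside Enterprise edition for the view index to be used).
        using System.Data.SqlClient;

        class UserLookup
        {
            static int? FindUserByRrn(string connectionString, string rrn)
            {
                const string sql =
                    "SELECT UserId FROM dbo.vwi_Users WITH (NOEXPAND) WHERE RRN = @rrn";

                using (var con = new SqlConnection(connectionString))
                using (var cmd = new SqlCommand(sql, con))
                {
                    cmd.Parameters.AddWithValue("@rrn", rrn);
                    con.Open();
                    object result = cmd.ExecuteScalar();   // null when no user matches
                    return result == null ? (int?)null : (int)result;
                }
            }
        }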

    Read the article

  • SqlServer slow in a production environment

    - by Lieven Cardoen
    I have a weird problem in a production environment at a customer. I can't give any details on the infrastructure, except that SQL Server runs on a virtual server and the data, log and filestream files are on another storage server (data and filestream together, and the log on a separate server). Now, there's this query that, when we run it, gives these durations (first we clear the cache): 300 ms, 20 ms, 15 ms, 17 ms, ... The first time it takes longer, but from then on it is cached. At the customer, on a SQL Server that is more powerful, these are the durations (I didn't have the rights to clear the cache; will try this tomorrow): 2500 ms, 2600 ms, 2400 ms, ... The query can be improved, that's right, but that's not the question here. How would you tackle this? I don't know where to go from here. The servers at this customer really are more powerful, but they do use virtual servers (we don't). What could be the cause... not enough memory? Fragmentation? Physical storage? I know it can be a lot of things, but maybe some of you have got some info for me on how to go on with this issue...
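
    Not an answer, but one cheap way to narrow this down from the application side (a sketch; the query and connection string are placeholders) is to let SqlConnection gather statistics, which splits the time spent executing on the server from the time spent on the network, and can be compared between the two environments:

        // Sketch: use SqlConnection statistics to split server time from network time.
        // The command text is a placeholder for the slow query.
        using System;
        using System.Collections;
        using System.Data.SqlClient;

        class QueryTiming
        {
            static void Measure(string connectionString)
            {
                using (var con = new SqlConnection(connectionString))
                {
                    con.StatisticsEnabled = true;
                    con.Open();

                    using (var cmd = new SqlCommand("exec dbo.TheSlowQuery", con))
                    using (var reader = cmd.ExecuteReader())
                    {
                        while (reader.Read()) { /* drain the result set */ }
                    }

                    IDictionary stats = con.RetrieveStatistics();
                    Console.WriteLine("ExecutionTime (ms):     {0}", stats["ExecutionTime"]);
                    Console.WriteLine("NetworkServerTime (ms): {0}", stats["NetworkServerTime"]);
                    Console.WriteLine("BytesReceived:          {0}", stats["BytesReceived"]);
                }
            }
        }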

    Read the article

  • Queuing using a table or MSMQ?

    - by Lieven Cardoen
    Part of the application I'm working on is an swf that shows a test with some 80 questions. Each question is saved to SQL Server through WebORB and ASP.NET. When a candidate finishes the test, the session needs to be validated. The problem is that sometimes 350 candidates finish their test at the same moment, and the CPU on the web server and SQL Server explodes (350 validations concurrently). Now, how would I implement queuing here? In the database there's a table that has a record for each session. One column holds the status: 1 is finished, 2 is validated. I could implement queuing in two ways (as I see it; maybe you have other suggestions):

      • A process that checks the table for records with status 1. If it finds one, it validates the session. So sessions are validated one after another.
      • When a candidate finishes a session, a message is sent to an MSMQ queue. Another process listens to the queue and validates sessions one after another.

    Now, what would be the best approach? Where do you start the process that will validate sessions? In global.asax (Application_Start)? As a Windows service? As an exe in the root of the website that is started in Application_Start? To me, using the table and looking for records with status 1 seems the easiest way.
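
    For the table-based option, a sketch of what the polling worker could look like (hosted in a Windows service or in a background thread started from Application_Start; the table, column and method names are made up). The UPDLOCK/READPAST hints let more than one worker claim different sessions without blocking each other:

        // Sketch: a worker that claims one finished session at a time and validates it.
        // Table, column and method names are assumptions; Validate() stands in for the
        // existing validation logic.
        using System;
        using System.Data.SqlClient;
        using System.Threading;

        class SessionValidatorWorker
        {
            // Claim one session with status 1 by marking it as "being validated"
            // (status 3 is an assumption, not part of the original schema), so other
            // workers skip it. UPDLOCK/READPAST keeps workers from blocking each other.
            const string ClaimSql =
                @"UPDATE TOP (1) dbo.Sessions WITH (UPDLOCK, READPAST)
                  SET Status = 3
                  OUTPUT inserted.SessionId
                  WHERE Status = 1";

            public void Run(string connectionString)
            {
                while (true)
                {
                    int? sessionId = ClaimNextSession(connectionString);
                    if (sessionId == null)
                    {
                        Thread.Sleep(TimeSpan.FromSeconds(5));   // nothing finished, back off
                        continue;
                    }
                    Validate(sessionId.Value);   // existing logic; afterwards set Status = 2 (not shown)
                }
            }

            static int? ClaimNextSession(string connectionString)
            {
                using (var con = new SqlConnection(connectionString))
                using (var cmd = new SqlCommand(ClaimSql, con))
                {
                    con.Open();
                    object id = cmd.ExecuteScalar();   // null when no session has status 1
                    return id == null ? (int?)null : (int)id;
                }
            }

            static void Validate(int sessionId)
            {
                // ... existing validation ...
            }
        }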

    Read the article

  • SocketTimeout: Read timed out

    - by Lieven Cardoen
    I'm using Flex - IIS - ASP.NET to do remote calls. When I stress test, all remote calls that take longer than 30 seconds fail. In Charles I get a message saying 'SocketTimeout: Read timed out'. Is this something that can be configured in IIS? Or could it be a problem with a setting in Charles?

    Read the article

  • If a table has two xml columns, will inserting records be a lot slower?

    - by Lieven Cardoen
    Is it a bad thing to have two xml columns in one table? And how much slower are these xml columns in terms of updating/inserting/reading data? In Profiler this kind of insert normally takes 0 ms, but sometimes it goes up to 160 ms:

        declare @p8 xml
        set @p8=convert(xml,N'<interactions><interaction correct="false" score="0" id="0" gapid="0" x="61" y="225"><feedback/><element id="0" position="0" elementtype="1"><asset/></element></interaction><interaction correct="false" score="0" id="1" gapid="1" x="64" y="250"><feedback/><element id="0" position="0" elementtype="1"><asset/></element></interaction><interaction correct="false" score="0" id="2" gapid="2" x="131" y="250"><feedback/><element id="0" position="0" elementtype="1"><asset/></element></interaction></interactions>')
        declare @p14 xml
        set @p14=convert(xml,N'<contentinteractions/>')
        exec sp_executesql N'INSERT INTO [dbo].[PackageSessionNodes](
            [dbo].[PackageSessionNodes].[PackageSessionId],
            [dbo].[PackageSessionNodes].[TreeNodeId],
            [dbo].[PackageSessionNodes].[Duration],
            [dbo].[PackageSessionNodes].[Score],
            [dbo].[PackageSessionNodes].[ScoreMax],
            [dbo].[PackageSessionNodes].[Interactions],
            [dbo].[PackageSessionNodes].[BrainTeaser],
            [dbo].[PackageSessionNodes].[DateCreated],
            [dbo].[PackageSessionNodes].[CompletionStatus],
            [dbo].[PackageSessionNodes].[ReducedScore],
            [dbo].[PackageSessionNodes].[ReducedScoreMax],
            [dbo].[PackageSessionNodes].[ContentInteractions])
        VALUES (
            @ins_dboPackageSessionNodesPackageSessionId,
            @ins_dboPackageSessionNodesTreeNodeId,
            @ins_dboPackageSessionNodesDuration,
            @ins_dboPackageSessionNodesScore,
            @ins_dboPackageSessionNodesScoreMax,
            @ins_dboPackageSessionNodesInteractions,
            @ins_dboPackageSessionNodesBrainTeaser,
            @ins_dboPackageSessionNodesDateCreated,
            @ins_dboPackageSessionNodesCompletionStatus,
            @ins_dboPackageSessionNodesReducedScore,
            @ins_dboPackageSessionNodesReducedScoreMax,
            @ins_dboPackageSessionNodesContentInteractions) ;
        SELECT SCOPE_IDENTITY() as new_id

    This is the table:

        CREATE TABLE [dbo].[PackageSessionNodes](
          [PackageSessionNodeId] [int] IDENTITY(1,1) NOT NULL,
          [PackageSessionId] [int] NOT NULL,
          [TreeNodeId] [int] NOT NULL,
          [Duration] [int] NULL,
          [Score] [float] NOT NULL,
          [ScoreMax] [float] NOT NULL,
          [Interactions] [xml] NOT NULL,
          [BrainTeaser] [bit] NOT NULL,
          [DateCreated] [datetime] NULL,
          [CompletionStatus] [int] NOT NULL,
          [ReducedScore] [float] NOT NULL,
          [ReducedScoreMax] [float] NOT NULL,
          [ContentInteractions] [xml] NOT NULL,
          CONSTRAINT [PK_PackageSessionNodes] PRIMARY KEY CLUSTERED
          (
            [PackageSessionNodeId] ASC
          ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
        ) ON [PRIMARY]
        GO
        ALTER TABLE [dbo].[PackageSessionNodes] WITH CHECK ADD CONSTRAINT [FK_PackageSessionNodes_PackageSessions]
          FOREIGN KEY([PackageSessionId]) REFERENCES [dbo].[PackageSessions] ([PackageSessionId]) ON UPDATE CASCADE ON DELETE CASCADE
        GO
        ALTER TABLE [dbo].[PackageSessionNodes] CHECK CONSTRAINT [FK_PackageSessionNodes_PackageSessions]
        GO
        ALTER TABLE [dbo].[PackageSessionNodes] WITH CHECK ADD CONSTRAINT [FK_PackageSessionNodes_TreeNodes]
          FOREIGN KEY([TreeNodeId]) REFERENCES [dbo].[TreeNodes] ([TreeNodeId])
        GO
        ALTER TABLE [dbo].[PackageSessionNodes] CHECK CONSTRAINT [FK_PackageSessionNodes_TreeNodes]
        GO
        ALTER TABLE [dbo].[PackageSessionNodes] ADD CONSTRAINT [DF_PackageSessionNodes_Score] DEFAULT ((-1)) FOR [Score]
        GO
        ALTER TABLE [dbo].[PackageSessionNodes] ADD CONSTRAINT [DF_PackageSessionNodes_ScoreMax] DEFAULT ((-1)) FOR [ScoreMax]
        GO
        ALTER TABLE [dbo].[PackageSessionNodes] ADD CONSTRAINT [DF_PackageSessionNodes_DateCreated] DEFAULT (getdate()) FOR [DateCreated]
        GO
        ALTER TABLE [dbo].[PackageSessionNodes] ADD CONSTRAINT [DF_PackageSessionNodes_ReducedScore] DEFAULT ((-1)) FOR [ReducedScore]
        GO
        ALTER TABLE [dbo].[PackageSessionNodes] ADD CONSTRAINT [DF_PackageSessionNodes_ReducedScoreMax] DEFAULT ((-1)) FOR [ReducedScoreMax]
        GO
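
    A side note on how such inserts can be sent from .NET (a sketch, and not necessarily related to the 160 ms spikes): the two xml values can be passed as native xml parameters instead of convert(xml, N'...'), which avoids converting an nvarchar literal on the server. Only the xml parameters are shown; the remaining NOT NULL columns still need values:

        // Sketch: pass the two xml columns as SqlDbType.Xml parameters.
        // The other (NOT NULL) columns are omitted here and must be supplied as well.
        using System.Data;
        using System.Data.SqlClient;
        using System.Data.SqlTypes;
        using System.Xml;

        class PackageSessionNodeWriter
        {
            static void Insert(SqlConnection con, string interactionsXml, string contentInteractionsXml)
            {
                const string sql =
                    @"INSERT INTO dbo.PackageSessionNodes (Interactions, ContentInteractions /*, ... */)
                      VALUES (@interactions, @contentInteractions /*, ... */)";

                using (var cmd = new SqlCommand(sql, con))
                {
                    cmd.Parameters.Add("@interactions", SqlDbType.Xml).Value =
                        new SqlXml(XmlReader.Create(new System.IO.StringReader(interactionsXml)));
                    cmd.Parameters.Add("@contentInteractions", SqlDbType.Xml).Value =
                        new SqlXml(XmlReader.Create(new System.IO.StringReader(contentInteractionsXml)));
                    cmd.ExecuteNonQuery();
                }
            }
        }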

    Read the article

  • Random select does not always return a single row

    - by Lieven
    The intention of the following (simplified) code fragment is to return one random row. Unfortunately, when we run this fragment in the query analyzer, it returns between zero and three results. As our input table consists of exactly 5 rows with unique IDs, and as we perform a select on this table where ID equals a random number, we are stumped that there would ever be more than one row returned. Note: among other things, we already tried casting the checksum result to an integer, to no avail.

        DECLARE @Table TABLE
        (
          ID INTEGER IDENTITY (1, 1),
          FK1 INTEGER
        )

        INSERT INTO @Table
        SELECT 1 UNION ALL
        SELECT 2 UNION ALL
        SELECT 3 UNION ALL
        SELECT 4 UNION ALL
        SELECT 5

        SELECT *
        FROM @Table
        WHERE ID = ABS(CHECKSUM(NEWID())) % 5 + 1

    Read the article
