Does it make sense to use int instead of char or nvarchar for a discriminator column if I'm using i
Posted by Omu on Stack Overflow, 2010-03-29.
I have something like this:
create table account
(
    id int identity(1,1) primary key,
    -- discriminator: 'a' or 'b'
    usertype char(1) check (usertype in ('a', 'b')) not null,
    -- composite unique key so subtype tables can reference (id, usertype)
    unique (id, usertype)
)
create table auser
(
    id int primary key,
    -- pinned to 'a'; together with the foreign key below, this guarantees
    -- every auser row points at an account row of type 'a'
    usertype char(1) check (usertype = 'a') not null,
    foreign key (id, usertype) references account (id, usertype)
)
create table buser
(
    id int primary key,
    usertype char(1) check (usertype = 'b') not null, -- same as auser, just with 'b'
    foreign key (id, usertype) references account (id, usertype)
)
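To make the mechanics concrete, here is a short usage sketch (assuming the tables start empty, so the first identity value is 1):

-- account row 1 is created with type 'a'
insert into account (usertype) values ('a')
-- succeeds: account contains a row with (id = 1, usertype = 'a')
insert into auser (id, usertype) values (1, 'a')
-- fails: there is no account row with (id = 1, usertype = 'b'),
-- so a buser row can never point at an 'a'-typed account
insert into buser (id, usertype) values (1, 'b')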
The question is: if I use int instead of char(1) for the discriminator, will it be faster or work better?
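For comparison, here is a minimal sketch of what the int-based variant would look like (the value mapping 1 = a-user, 2 = b-user is my assumption, not part of the original post):

create table account
(
    id int identity(1,1) primary key,
    -- integer discriminator; 1 = a-user, 2 = b-user (assumed mapping)
    usertype int check (usertype in (1, 2)) not null,
    unique (id, usertype)
)

create table auser
(
    id int primary key,
    usertype int check (usertype = 1) not null,
    foreign key (id, usertype) references account (id, usertype)
)

One point worth noting on the design choice: in SQL Server, char(1) occupies one byte while int occupies four, so an int discriminator actually makes the row wider; tinyint would match char(1) at one byte. Either way, any speed difference between the two is likely to be negligible.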