I need some suggestions on this approach; please see the examples below.
I have a table structure like this:
data_jobtype
id, identifier (varchar), description (varchar), userid (int)
data_statustype
id, identifier (varchar), description (varchar), userid (int)
data_usertype
id, identifier (varchar), description (varchar), userid (int)
data_roletype
id, identifier (varchar), description (varchar), userid (int)
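For reference, each of these tables is created along these lines (the column lengths and primary key here are just placeholders, not my exact DDL):

create table data_jobtype (
    id          int primary key,       -- placeholder surrogate key
    identifier  varchar(50) not null,  -- placeholder length
    description varchar(255),          -- placeholder length
    userid      int
);
-- data_statustype, data_usertype, data_roletype etc. have the same columns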
There are about 10 more similar tables. Then I had another idea and created a new table:
data_types
id, identifier (varchar), description (varchar), userid (int), typetype (varchar)
This table takes all the data from the above tables, and the typetype field tells which type of data each row holds, e.g. typetype = 'jobtype' or typetype = 'roletype'.
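So the consolidated table would look something like this (again, the lengths are placeholders, and the composite index on (typetype, identifier) is just my guess at what the lookups below would need):

create table data_types (
    id          int primary key,
    identifier  varchar(50) not null,
    description varchar(255),
    userid      int,
    typetype    varchar(20) not null   -- 'jobtype', 'roletype', 'usertype', ...
);

create index ix_data_types_type_identifier
    on data_types (typetype, identifier);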
This second approach works just fine, but when I was writing a query that refers to the same table twice, joining on two different types, I realised I need to understand whether querying a single table multiple times in one query is better than querying multiple tables. Example query:
select u.*, dt.description as usertype_desc, dt2.description as roletype_desc
from users u
left join data_types dt  on dt.identifier = u.user_identifier  and dt.typetype = 'usertype'
left join data_types dt2 on dt2.identifier = u.role_identifier and dt2.typetype = 'roletype'
where u.status = 'live'
It's not only this query I am worried about. The project is still in its early days, but it will eventually grow, so I have time to get the basic structure right now. I'd appreciate your comments on whether I am doing it right: which approach would you recommend over the other, and why? Thanks.
Much simpler to link in both types in the join condition:
left join data_types dt
  on dt.identifier in (u.user_identifier, u.role_identifier)
 and dt.typetype in ('usertype', 'roletype')
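With a single join you get up to one row per matched type, so if you still want one column per type you can pivot with conditional aggregation, something like this (a sketch, assuming users.id is the primary key and identifier values don't overlap between typetypes):

select u.id,
       max(case when dt.typetype = 'usertype' then dt.description end) as usertype_desc,
       max(case when dt.typetype = 'roletype' then dt.description end) as roletype_desc
from users u
left join data_types dt
  on dt.identifier in (u.user_identifier, u.role_identifier)
 and dt.typetype in ('usertype', 'roletype')
where u.status = 'live'
group by u.id;
-- if identifiers can overlap between typetypes, pair each typetype with its own
-- identifier column in the on clause instead of using in (...)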
I don't think there are performance implications for the single table approach. The question is what makes for more readable queries.
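If you want to reassure yourself on the performance side, run the two-join query through EXPLAIN (both MySQL and PostgreSQL support it) and check that each join against data_types is resolved through an index on (typetype, identifier) rather than a full scan:

explain
select u.*, dt.description as usertype_desc, dt2.description as roletype_desc
from users u
left join data_types dt  on dt.identifier = u.user_identifier  and dt.typetype = 'usertype'
left join data_types dt2 on dt2.identifier = u.role_identifier and dt2.typetype = 'roletype'
where u.status = 'live';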